Saturday, December 21, 2024

TensorFlow Example 1

Here is a basic example of using TensorFlow to build a simple neural network that classifies the MNIST dataset of handwritten digits. It provides a gentle introduction to TensorFlow and how to work with it.

Step 1: Install TensorFlow

If you haven't installed TensorFlow yet, you can install it using pip:

pip install tensorflow
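
To confirm the installation worked, you can print the installed version from Python. This is just a quick sanity check; the exact version string will depend on what pip installed:

import tensorflow as tf
print(tf.__version__)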

Step 2: Import Necessary Libraries

import tensorflow as tf
from tensorflow.keras import layers, models
import numpy as np
import matplotlib.pyplot as plt

Step 3: Load and Preprocess the Data

MNIST is a dataset of 60,000 training images and 10,000 test images of handwritten digits (0–9). We will use TensorFlow's Keras API (tf.keras) to load it.

# Load the MNIST dataset
mnist = tf.keras.datasets.mnist

# Split the dataset into training and test data
(train_images, train_labels), (test_images, test_labels) = mnist.load_data()

# Normalize the image data to values between 0 and 1
train_images, test_images = train_images / 255.0, test_images / 255.0

# The images are 28x28 pixels, and we need to reshape them to (28, 28, 1) to fit the CNN input format
train_images = train_images.reshape((train_images.shape[0], 28, 28, 1))
test_images = test_images.reshape((test_images.shape[0], 28, 28, 1))

# Check the shape of the data
print("Train data shape:", train_images.shape)
print("Test data shape:", test_images.shape)

Step 4: Build the Neural Network Model

In this example, we'll create a simple Convolutional Neural Network (CNN) model to classify the digits.

# Build the CNN model
model = models.Sequential()

# Add a convolutional layer with 32 filters and a 3x3 kernel
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))

# Add a second convolutional layer
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))

# Flatten the output of the previous layer
model.add(layers.Flatten())

# Add a fully connected layer with 64 units
model.add(layers.Dense(64, activation='relu'))

# Output layer with 10 units (for 10 classes: digits 0–9) and softmax activation
model.add(layers.Dense(10, activation='softmax'))

# Display the model summary
model.summary()

Step 5: Compile the Model

Now, we'll compile the model by specifying the optimizer, loss function, and evaluation metric.

# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

Step 6: Train the Model

Now, we will train the model using the training data.

# Train the model
history = model.fit(train_images, train_labels, epochs=5, batch_size=64, validation_split=0.2)
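
The returned history object records the loss and accuracy after each epoch. As a quick sketch (reusing the matplotlib import from Step 2), you can plot the training and validation accuracy curves; the key names 'accuracy' and 'val_accuracy' assume the metric configured in Step 5:

# Plot training and validation accuracy per epoch
plt.plot(history.history['accuracy'], label='train accuracy')
plt.plot(history.history['val_accuracy'], label='validation accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()
plt.show()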

Step 7: Evaluate the Model

After training, we can evaluate the model on the test set to see how well it performs.

# Evaluate the model on the test data
test_loss, test_acc = model.evaluate(test_images, test_labels)
print(f"Test accuracy: {test_acc:.4f}")

Step 8: Make Predictions (Optional)

You can make predictions using the trained model and visualize the results.

# Make predictions on the test set
predictions = model.predict(test_images)

# Visualize the first test image and its predicted label
plt.imshow(test_images[0].reshape(28, 28), cmap='gray')
plt.title(f"Predicted Label: {np.argmax(predictions[0])}, Actual Label: {test_labels[0]}")
plt.show()
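
For a per-class view of where the model goes wrong, you can also summarize the predictions in a confusion matrix. This is an optional sketch built on the predictions array from above, using tf.math.confusion_matrix:

# Convert the softmax outputs to predicted class labels
predicted_labels = np.argmax(predictions, axis=1)

# Rows are true digits, columns are predicted digits
confusion = tf.math.confusion_matrix(test_labels, predicted_labels, num_classes=10)
print(confusion.numpy())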

Conclusion:

In this example, you have:

  • Loaded and preprocessed the MNIST dataset.
  • Built a simple CNN model using tensorflow.keras.
  • Compiled and trained the model on the training data.
  • Evaluated its performance on the test data.
  • Made predictions and visualized results.

You can experiment with different architectures, add dropout layers, change hyperparameters, or try using other datasets. TensorFlow is flexible and allows you to easily extend this simple example for more complex problems.
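
For example, a common first experiment is to add a Dropout layer before the output layer to reduce overfitting. The sketch below is just one possible variation of the model from Step 4; the dropout rate of 0.5 is an arbitrary starting point you can tune:

# Variant of the Step 4 model with dropout added before the output layer
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dropout(0.5),  # randomly drops 50% of the units during training
    layers.Dense(10, activation='softmax')
])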
