Build the model

Step 1: Import the necessary libraries

Python3




import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt
import os
import subprocess


Step 2: Load the dataset

Python3




# Load the CIFAR-10 dataset (bundled with Keras and openly available)
(data_train, label_train), (data_test, label_test) = keras.datasets.cifar10.load_data()
 
print('Train data:', data_train.shape, label_train.shape)
print('Test data:', data_test.shape, label_test.shape)


Output:

Train data : (50000, 32, 32, 3) (50000, 1)
Test data : (10000, 32, 32, 3) (10000, 1)
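The images are loaded as raw uint8 pixel values in [0, 255], and the tutorial feeds them to the network as-is. A common extra preprocessing step (not part of this tutorial's pipeline, shown here only as a sketch on a small dummy batch) is scaling pixels to [0.0, 1.0], which often speeds up convergence:

```python
import numpy as np

# A small stand-in batch of uint8 images; in the tutorial this would be
# data_train / data_test with shapes (50000, 32, 32, 3) and (10000, 32, 32, 3)
batch = np.random.randint(0, 256, size=(4, 32, 32, 3), dtype=np.uint8)

# Scale pixel values from [0, 255] down to [0.0, 1.0]
batch_scaled = batch.astype('float32') / 255.0

print(batch_scaled.dtype)   # float32
```

If you apply this, make sure to scale the train and test sets identically.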

Number of classes

Python3




classes = ['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
num_classes = len(classes)
num_classes


Output:

10
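The labels in the dataset are integer indices 0-9, and the classes list maps each index back to a readable name. Note that label_train has shape (50000, 1), so each entry is a one-element array. A quick sketch (the sample label below is made up, not read from the dataset):

```python
classes = ['airplane', 'automobile', 'bird', 'cat', 'deer',
           'dog', 'frog', 'horse', 'ship', 'truck']

# A hypothetical label as it would appear in label_train: shape (1,),
# so we index with [0] to get the integer class id
sample_label = [[3]]

print(classes[sample_label[0][0]])  # cat
```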

Step 3: Build the model

Python3




model = keras.models.Sequential()
model.add(keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)))
model.add(keras.layers.MaxPooling2D((2, 2)))
model.add(keras.layers.Conv2D(64, (3, 3), activation='relu'))
model.add(keras.layers.MaxPooling2D((2, 2)))
model.add(keras.layers.Conv2D(128, (3, 3), activation='relu'))
model.add(keras.layers.Conv2D(128, (3, 3), activation='relu'))
model.add(keras.layers.MaxPooling2D((2, 2)))
model.add(keras.layers.Flatten())
model.add(keras.layers.Dense(256, activation='relu'))
model.add(keras.layers.Dense(num_classes, activation='softmax'))
# Compile the model. The final Dense layer already applies softmax,
# so the loss must be computed from probabilities, not logits
model.compile(optimizer='adam',
              loss=keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics=[keras.metrics.SparseCategoricalAccuracy()])
# Model summary
model.summary()


Output:

Model: "sequential"
_________________________________________________________________
 Layer (type)                   Output Shape              Param #
=================================================================
 conv2d (Conv2D)                (None, 30, 30, 32)        896

 max_pooling2d (MaxPooling2D)   (None, 15, 15, 32)        0

 conv2d_1 (Conv2D)              (None, 13, 13, 64)        18496

 max_pooling2d_1 (MaxPooling2D) (None, 6, 6, 64)          0

 conv2d_2 (Conv2D)              (None, 4, 4, 128)         73856

 conv2d_3 (Conv2D)              (None, 2, 2, 128)         147584

 max_pooling2d_2 (MaxPooling2D) (None, 1, 1, 128)         0

 flatten (Flatten)              (None, 128)               0

 dense (Dense)                  (None, 256)               33024

 dense_1 (Dense)                (None, 10)                2570

=================================================================
Total params: 276,426
Trainable params: 276,426
Non-trainable params: 0
_________________________________________________________________
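The parameter counts in the summary can be verified by hand: a Conv2D layer has (kernel_height × kernel_width × input_channels + 1 bias) × filters parameters, and a Dense layer has (input_features + 1 bias) × units. Checking each layer of this model:

```python
# Conv2D: (kernel_h * kernel_w * in_channels + 1 bias) * filters
conv1 = (3 * 3 * 3 + 1) * 32      # 896
conv2 = (3 * 3 * 32 + 1) * 64     # 18496
conv3 = (3 * 3 * 64 + 1) * 128    # 73856
conv4 = (3 * 3 * 128 + 1) * 128   # 147584

# Dense: (in_features + 1 bias) * units
dense1 = (128 + 1) * 256          # 33024
dense2 = (256 + 1) * 10           # 2570

total = conv1 + conv2 + conv3 + conv4 + dense1 + dense2
print(total)  # 276426
```

Every figure matches the model.summary() output above, including the total of 276,426 trainable parameters.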

Step 4: Train the model

Python3




epochs = 4
model.fit(data_train, label_train, epochs=epochs)


Output:

Epoch 1/4
1563/1563 [==============================] - 14s 8ms/step - loss: 1.6731 - sparse_categorical_accuracy: 0.4234
Epoch 2/4
1563/1563 [==============================] - 14s 9ms/step - loss: 1.2742 - sparse_categorical_accuracy: 0.5499
Epoch 3/4
1563/1563 [==============================] - 17s 11ms/step - loss: 1.1412 - sparse_categorical_accuracy: 0.5994
Epoch 4/4
1563/1563 [==============================] - 17s 11ms/step - loss: 1.0541 - sparse_categorical_accuracy: 0.6328
<keras.callbacks.History at 0x7f6f5c532920>

Step 5: Evaluate the model

Python3




loss, accuracy = model.evaluate(data_test, label_test)
print()
print(f"Test accuracy: {accuracy * 100}")
print(f"Test loss: {loss}")


Output:

313/313 [==============================] - 1s 4ms/step - loss: 1.0704 - sparse_categorical_accuracy: 0.6358

Test accuracy: 63.58000040054321
Test loss: 1.0704132318496704
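To get per-image predictions rather than an aggregate accuracy, model.predict(data_test) returns one softmax row of 10 probabilities per image, and np.argmax picks the most likely class. A sketch on made-up probability rows (the real call needs the trained model from the steps above):

```python
import numpy as np

classes = ['airplane', 'automobile', 'bird', 'cat', 'deer',
           'dog', 'frog', 'horse', 'ship', 'truck']

# Made-up softmax output for two images; model.predict(data_test) would
# return an array of shape (10000, 10) with rows like these
probs = np.array([
    [0.05, 0.02, 0.01, 0.70, 0.05, 0.05, 0.04, 0.03, 0.03, 0.02],
    [0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.90, 0.02],
])

# Index of the highest probability in each row = predicted class id
pred_ids = np.argmax(probs, axis=1)
print([classes[i] for i in pred_ids])  # ['cat', 'ship']
```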

Step 6: Save the model

Python3




import tempfile
import os
import glob
 
# Specify the current model version
curr_version = 1
 
# Use the system's temporary directory as the export root
my_dir = tempfile.gettempdir()
 
# Build the export path for the current version
path = os.path.join(my_dir, str(curr_version))
print(f'Export Path = {path}\n')
 
# Save the model in the SavedModel format
# (the versioned directory layout TensorFlow Serving expects)
model.save(path, overwrite=True, include_optimizer=True)
 
print('\nSaved model:')
 
# List the contents of the export path
file_list = glob.glob(f'{path}/*')
for file in file_list:
    print(file)


Output:

Export Path = /tmp/1
INFO:tensorflow:Assets written to: /tmp/1/assets

Saved model:
/tmp/1/fingerprint.pb
/tmp/1/saved_model.pb
/tmp/1/assets
/tmp/1/variables
/tmp/1/keras_metadata.pb

Serving a TensorFlow Model

TensorFlow Serving is a flexible, high-performance system for serving machine learning models in production. It is designed to make deploying new algorithms and experiments easy while keeping the same server architecture and APIs. Although it integrates most naturally with TensorFlow models, it can be extended to serve other model types and data as well.
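Once a TensorFlow Serving instance is running with the exported SavedModel, clients query it over a REST API by POSTing a JSON body with an "instances" list to the model's :predict endpoint. The sketch below only builds such a payload; the model name "cifar10", the port 8501, and the all-zero dummy image are assumptions for illustration, and the actual HTTP call (commented out) would need a server running:

```python
import json

# Hypothetical endpoint: assumes a TensorFlow Serving instance is already
# running on port 8501 and serving the exported model under the name "cifar10"
url = 'http://localhost:8501/v1/models/cifar10:predict'

# One 32x32x3 image as nested lists; a dummy all-zero image for illustration
image = [[[0.0, 0.0, 0.0] for _ in range(32)] for _ in range(32)]

# TensorFlow Serving's REST API expects a JSON body with an "instances" list
payload = json.dumps({"instances": [image]})

# The request itself could then be sent with the requests library, e.g.:
# response = requests.post(url, data=payload)
# predictions = json.loads(response.text)['predictions']

print(list(json.loads(payload).keys()))  # ['instances']
```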
