Training and evaluating the classifier

Before we can train the classifier, we have to configure it for training: we specify an objective (the loss function) and an optimizer, and we can optionally add some metrics to track how the model performs. We configure the classifier using the model's compile method:

model.compile(loss="categorical_crossentropy",
              optimizer="adam",
              metrics=["categorical_accuracy",
                       "top_k_categorical_accuracy"])

We have passed categorical_accuracy as a metric, which reports the fraction of the dataset classified with the right class. Besides this, we have passed one more metric called top_k_categorical_accuracy, which reports the fraction of the dataset for which the correct class appears among the network's top k predictions.

The default value of k is five, so the metric shows what fraction of the dataset falls within the five most probable classes predicted by the neural network. We have also passed optimizer="adam", which tells the model to use the Adam optimizer as the training algorithm. You will learn how neural networks are usually trained in the Understanding backpropagation section.
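To make the top-k behavior concrete, the metric can be computed by hand: a prediction counts as correct if the true class is among the k highest-scoring outputs. A minimal NumPy sketch (the probability values here are invented for illustration):

```python
import numpy as np

def top_k_accuracy(probs, labels, k=5):
    """Fraction of samples whose true class is among the k most probable."""
    # Indices of the k highest-scoring classes for each sample
    top_k = np.argsort(probs, axis=1)[:, -k:]
    hits = [label in row for label, row in zip(labels, top_k)]
    return np.mean(hits)

# Three samples, four classes; the true classes are 2, 0, 3
probs = np.array([[0.1, 0.2, 0.6, 0.1],   # true class 2 is the top guess
                  [0.3, 0.4, 0.2, 0.1],   # true class 0 is the second-best guess
                  [0.5, 0.3, 0.1, 0.1]])  # true class 3 is the weakest guess
labels = [2, 0, 3]

print(top_k_accuracy(probs, labels, k=1))  # 1/3: only the first sample hits
print(top_k_accuracy(probs, labels, k=2))  # 2/3: the second sample now counts too
```

With k=1 this reduces to plain categorical accuracy, which is why the two metrics coincide for confident, correct models.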

Before training, we also split the dataset into training and test sets in order to see how the network performs on unseen data:

valid = inp_ds.take(1000)
train = inp_ds.skip(1000).shuffle(10**4)

Here, we take the first 1,000 elements of the dataset for test purposes; the remaining part is used for training.
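The take/skip split can be pictured with a plain Python analogy: tf.data's take and skip behave like slicing the element stream at a fixed position (the stand-in list and split point below are invented for illustration):

```python
# A small stand-in for the element stream produced by inp_ds
elements = list(range(10))

valid = elements[:3]   # like dataset.take(3): the first 3 elements
train = elements[3:]   # like dataset.skip(3): everything after them

print(valid)  # [0, 1, 2]
print(train)  # [3, 4, 5, 6, 7, 8, 9]
```

Because take and skip cut at the same position, the two resulting datasets never overlap, which is what makes the held-out set genuinely unseen.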

The training part is shuffled by calling the shuffle method, which makes sure that the data comes in a different order in each epoch of training. Finally, we train our network by calling the model's fit method and then evaluate it on the validation set:

model.fit(train.batch(32), epochs=4)
model.evaluate(valid.batch(1))

First, the fit method accepts the dataset itself, which we pass in batches of 32: on each step of the training process, 32 images from the dataset are used.

We have also passed the number of epochs, which means that our dataset will be iterated over four times before the training procedure stops. The output of the last epoch looks as follows:

Epoch 4/4
84/84 [==============================] - 13s 156ms/step - loss: 0.0834 - categorical_accuracy: 0.9717 - top_k_categorical_accuracy: 1.0000
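The 84 in the progress bar is the number of batches per epoch, which follows directly from the batch size: with batches of 32, 84 steps correspond to roughly 84 × 32 ≈ 2,700 training images (the exact dataset size below is an assumption; the last batch may be smaller). The relationship can be checked with a one-liner:

```python
import math

batch_size = 32
num_train_images = 2688  # assumed size; any count in (2656, 2688] yields 84 steps

# Keras rounds up so the final, possibly partial batch still counts as a step
steps_per_epoch = math.ceil(num_train_images / batch_size)
print(steps_per_epoch)  # 84
```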

Our categorical accuracy on the training set is more than 97%, so we are pretty good at differentiating between cats and dogs. Of course, the top-k accuracy is 100 percent, as we have just two classes and k defaults to five. Now, let's see how we perform on the validation set.

After training, we evaluate the model on the held-out set; you should obtain results similar to those on the training set:

model.evaluate(valid.batch(1))

The output is given as follows:

1000/1000 [==============================] - 9s 9ms/step - loss: 0.0954 - categorical_accuracy: 0.9730 - top_k_categorical_accuracy: 1.0000

We again get a categorical accuracy of more than 97%. Therefore, our model does not overfit and performs well on the test set.

If we train on breeds instead, the same output for training looks as follows:

Epoch 4/4
84/84 [==============================] - 13s 155ms/step - loss: 0.3272 - categorical_accuracy: 0.9233 - top_k_categorical_accuracy: 0.9963

Meanwhile, the output for testing looks like this: 

1000/1000 [==============================] - 11s 11ms/step - loss: 0.5646 - categorical_accuracy: 0.8080 - top_k_categorical_accuracy: 0.9890

For breeds, we get worse results, which is expected, as it is much more difficult to tell breeds apart than to merely state whether an animal is a cat or a dog. In any case, the model does not perform too badly: its first guess is right more than 80 percent of the time, and the correct breed is among its top five guesses about 99 percent of the time.
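To inspect the top five guesses for a single image, the class probabilities returned by the model can be sorted. A hedged sketch — the probability vector and breed names below are invented stand-ins for what model.predict and your label list would provide:

```python
import numpy as np

# Stand-ins for model.predict(image)[0] and the class-label list
probs = np.array([0.05, 0.40, 0.10, 0.02, 0.25, 0.08, 0.07, 0.03])
breeds = ["abyssinian", "beagle", "bengal", "birman",
          "boxer", "bulldog", "chihuahua", "persian"]

# Indices of the five most probable classes, most probable first
top5 = np.argsort(probs)[::-1][:5]
for i in top5:
    print(f"{breeds[i]}: {probs[i]:.2f}")
```

This is exactly the set of classes that top_k_categorical_accuracy checks the true label against during evaluation.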

In this section, we have learned how to use a pretrained classifier network to build a new classifier. In the next section, we will move ahead with our deep learning journey and create an object localization network using the same base model: a task that the base model was never trained to accomplish.
