Impact of a change in batch size

As discussed earlier, the smaller the batch size, the more often the weights are updated in a given epoch. This means fewer epochs are required to achieve a certain accuracy. At the same time, if the batch size is too small, each update is based on very few samples, and the resulting noisy gradient estimates can make training unstable.

Let's compare the previously built network with a much bigger batch size in one scenario and a much smaller batch size in the next:
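The code listing for this scenario is not visible in this excerpt; the following is a minimal sketch of what training with a very large batch size might look like, assuming an MNIST-style dataset and a simple fully connected Keras classifier (the architecture, optimizer, and variable names are illustrative assumptions, not the book's exact code):

```python
from tensorflow import keras
from tensorflow.keras.layers import Dense

# Illustrative data: MNIST, flattened to 784 features and scaled to [0, 1]
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255
x_test = x_test.reshape(-1, 784).astype("float32") / 255

# An assumed simple fully connected classifier
model = keras.Sequential([
    keras.Input(shape=(784,)),
    Dense(1000, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Very large batch size: with 60,000 training samples and batch_size=30000,
# each epoch performs only two weight updates
history = model.fit(x_train, y_train,
                    epochs=300,
                    batch_size=30000,
                    validation_data=(x_test, y_test))
```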

Note that in the preceding scenario, where the batch size is very high, the test dataset accuracy at the end of 300 epochs is only 89.91%.

The reason for this is that a network with a batch size of 1,024 updates its weights far more often than one with a batch size of 30,000. With 60,000 training samples, for example, a batch size of 1,024 yields about 59 weight updates per epoch, while a batch size of 30,000 yields only 2, so the smaller-batch network learns the weights much faster.

In the next scenario, we will reduce the batch size to a very small number to see the impact on network accuracy:
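Again, the original listing is hidden in this excerpt; here is a sketch of the small-batch scenario, continuing the illustrative setup above with an assumed batch size of 32 (the exact value the book uses is not shown here):

```python
# Rebuild the model so the weights start fresh for a fair comparison
model = keras.Sequential([
    keras.Input(shape=(784,)),
    Dense(1000, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Very small batch size: with 60,000 samples and batch_size=32, each epoch
# performs ~1,875 weight updates, so epochs are slow but accuracy rises fast
history = model.fit(x_train, y_train,
                    epochs=10,
                    batch_size=32,
                    validation_data=(x_test, y_test))
```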

Note that while accuracy improves very quickly, reaching 97.77% within just 10 epochs, training takes significantly longer, because the number of weight updates per epoch is much higher. More updates mean more calculations, and thus more time per epoch.
