How it works...

Using ConvNets, we increased our performance on the MNIST dataset, reaching almost 95 percent accuracy. Our ConvNet consists of two layers combining convolutions, ReLU, and max pooling, followed by two fully connected layers with dropout. Training happens in batches of size 128, with Adam as the optimizer, a learning rate of 0.001, and a maximum of 500 iterations.
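For reference, here is a minimal sketch of such a network written with tf.keras. The filter counts, kernel sizes, dense-layer width, and dropout rate are assumptions (the description above only fixes two conv/ReLU/max-pool blocks, two fully connected layers with dropout, Adam with a learning rate of 0.001, and batches of 128), and the original recipe may build the graph with lower-level TensorFlow ops instead:

import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1]; add a channel dimension.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# Two conv/ReLU/max-pool blocks followed by two fully connected layers
# with dropout. Filter counts and kernel sizes are illustrative choices.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 5, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2),
    tf.keras.layers.Conv2D(64, 5, activation="relu"),
    tf.keras.layers.MaxPooling2D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Adam optimizer with the learning rate from the recipe (0.001).
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# One pass over the training set in batches of 128 (~470 iterations,
# in the spirit of the 500-iteration cap mentioned above).
model.fit(x_train, y_train, batch_size=128, epochs=1)
model.evaluate(x_test, y_test, batch_size=128)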
