Neural Networks for Machine Learning

This chapter introduces the world of deep learning, whose methods make it possible to achieve state-of-the-art performance in many classification and regression tasks that are often considered extremely difficult to tackle (such as image segmentation, automatic translation, voice synthesis, and so on). The goal is to provide the reader with the basic tools to understand the structure of a fully connected neural network and to model it with the Python library Keras, employing modern techniques to speed up the training process and prevent overfitting.

In particular, the topics covered in the chapter are as follows:

  • The structure of a basic artificial neuron
  • Perceptrons, linear classifiers, and their limitations
  • Multilayer perceptrons with the most important activation functions (such as ReLU)
  • The back-propagation algorithm based on the stochastic gradient descent (SGD) optimization method
  • Optimized SGD variants (Momentum, RMSProp, Adam, AdaGrad, and AdaDelta)
  • Regularization and dropout
  • Batch normalization
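
As a preview of where the chapter is heading, the following is a minimal sketch of a fully connected Keras model that combines several of the techniques listed above (ReLU activations, batch normalization, dropout, and the Adam optimizer). The layer sizes, dropout rate, input shape, and synthetic data are illustrative placeholders, not values prescribed by the chapter.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Fully connected network with ReLU activations, batch normalization,
# and dropout (all covered later in the chapter).
model = keras.Sequential([
    layers.Input(shape=(64,)),               # 64 input features (placeholder)
    layers.Dense(128),
    layers.BatchNormalization(),
    layers.Activation('relu'),
    layers.Dropout(0.3),
    layers.Dense(64, activation='relu'),
    layers.Dropout(0.3),
    layers.Dense(10, activation='softmax')   # 10-class output (placeholder)
])

# Adam is one of the optimized SGD variants discussed in this chapter.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Random data stands in for a real dataset.
X = np.random.rand(256, 64).astype('float32')
y = np.random.randint(0, 10, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

Each of these building blocks (activation functions, regularization via dropout, batch normalization, and the optimizer choice) is examined in detail in the sections that follow.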