Regularization for neural networks

L1 and L2 regularization are widely used to train neural networks; in this context, L2 regularization is usually called weight decay. Data augmentation also plays an essential role in training neural networks. Other regularization methods exist as well. Dropout, for example, is a regularization technique developed specifically for neural networks: during training, it randomly drops some of the network's nodes, which prevents the remaining nodes from relying too heavily on the outputs of any particular other node. As a result, the model becomes more robust and less prone to overfitting.
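As a minimal sketch of how these two techniques are typically combined in practice, the PyTorch snippet below adds a dropout layer to a small classifier and applies weight decay through the optimizer. The layer sizes, dropout probability, and learning rate are illustrative assumptions, not values from the text.

```python
import torch
import torch.nn as nn

# Illustrative model: sizes (784 -> 256 -> 10) and p=0.5 are assumptions.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes 50% of activations during training
    nn.Linear(256, 10),
)

# L2 regularization ("weight decay") is applied via the optimizer:
# each update also shrinks the weights slightly toward zero.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

model.train()  # dropout is active in training mode...
model.eval()   # ...and disabled at evaluation time
```

Note that dropout only takes effect in training mode; calling `model.eval()` disables it so that predictions use the full network.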
