Dropout

Dropout is a computationally inexpensive yet powerful method of regularization for deep neural networks. It can be applied to both the input layer and the hidden layers. During the forward pass, dropout randomly masks the output of a fraction of the nodes in a layer by setting their output to zero. This is equivalent to removing that fraction of nodes from the layer and training a new, thinner network with fewer nodes. Typically, a fraction of 0.2 of the nodes is dropped at the input layer, and up to 0.5 in the hidden layers.
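
A minimal sketch in NumPy of the masking just described; the function name is illustrative, and the "inverted" rescaling of surviving activations (common in practice, though not mentioned above) is an assumption:

import numpy as np

def dropout_forward(activations, drop_prob, training=True):
    """Randomly zero a fraction `drop_prob` of the activations.

    Assumes "inverted" dropout: surviving activations are scaled by
    1 / (1 - drop_prob) so the expected output is unchanged and no
    rescaling is needed at test time.
    """
    if not training or drop_prob == 0.0:
        return activations
    keep_prob = 1.0 - drop_prob
    mask = np.random.binomial(1, keep_prob, size=activations.shape)
    return activations * mask / keep_prob

# Typical rates from the text: 0.2 at the input layer, up to 0.5 in hidden layers.
x = np.random.randn(4, 8)                  # a small batch of inputs
h = dropout_forward(x, drop_prob=0.2)      # masked inputs during training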

Model averaging (an ensemble method) is heavily used in machine learning to reduce generalization error by combining the outputs of several models. Bagging is one such ensemble method: k different datasets are constructed by random sampling with replacement from the training set, and k separate models are trained, one on each dataset. For a regression problem, for example, the final output is the average of the outputs of the k models. Other combination strategies exist as well.
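
A minimal sketch of bagging for regression, assuming scikit-learn is available; the choice of a decision tree as the base model and the function name are illustrative:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_predict(X_train, y_train, X_test, k=10, seed=0):
    """Train k regressors on bootstrap samples and average their outputs."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    preds = []
    for _ in range(k):
        idx = rng.integers(0, n, size=n)   # sample n points with replacement
        model = DecisionTreeRegressor().fit(X_train[idx], y_train[idx])
        preds.append(model.predict(X_test))
    return np.mean(preds, axis=0)          # average the k outputs (regression)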

Dropout can also be thought of as a model-averaging method: each random mask effectively creates a different model by changing the set of active nodes at the layers of the base network to which dropout is applied.
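
A toy illustration of this view, with assumed weights and layer sizes: averaging predictions over many sampled masks approximates averaging over the many thinned sub-networks that dropout implicitly defines.

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 16))    # toy weights for a 2-layer network
W2 = rng.standard_normal((16, 1))

def forward(x, drop_prob=0.5):
    """One thinned sub-network: a fresh random mask on the hidden layer."""
    h = np.maximum(x @ W1, 0.0)                            # ReLU hidden layer
    mask = rng.binomial(1, 1.0 - drop_prob, size=h.shape)  # drop half the nodes
    return (h * mask / (1.0 - drop_prob)) @ W2

x = rng.standard_normal((1, 8))
# Averaging over many sampled masks approximates the ensemble average
# of the sub-networks created by dropout.
ensemble_pred = np.mean([forward(x) for _ in range(100)], axis=0)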
