Hyperparameters of the neural network

The architecture-level settings of a neural network, such as the number of hidden layers and the number of units per hidden layer, together with training-related settings, such as the learning rate, the choice of optimizer, optimizer parameters like momentum, the L1/L2 regularization strength, and the dropout rate, are collectively called the hyperparameters of the neural network. The weights of the network, by contrast, are called its parameters; these are learned during training. Some hyperparameters affect the time and cost of training, while others affect the generalization performance of the model.
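The distinction can be made concrete in a short sketch: hyperparameters are fixed before training, while the parameter count (weights and biases) follows from the architecture-level choices. The names and values below are illustrative assumptions, not taken from the text.

```python
# Hyperparameters: chosen before training, not learned from data.
# All names and values here are illustrative assumptions.
hyperparameters = {
    "hidden_layers": 2,       # architecture-level
    "units_per_layer": 64,    # architecture-level
    "learning_rate": 0.01,    # training-related
    "momentum": 0.9,          # optimizer parameter
    "l2_penalty": 1e-4,       # regularizer strength
    "dropout_rate": 0.5,      # regularization
}

def count_parameters(input_dim, hidden_layers, units_per_layer, output_dim):
    """Count trainable parameters (weights + biases) of a fully connected
    network whose layer widths follow from the hyperparameters."""
    sizes = [input_dim] + [units_per_layer] * hidden_layers + [output_dim]
    # Each layer contributes fan_in * fan_out weights plus fan_out biases.
    return sum(fan_in * fan_out + fan_out
               for fan_in, fan_out in zip(sizes[:-1], sizes[1:]))

n_params = count_parameters(
    input_dim=10,
    hidden_layers=hyperparameters["hidden_layers"],
    units_per_layer=hyperparameters["units_per_layer"],
    output_dim=1,
)
print(n_params)  # → 4929
```

Changing an architecture-level hyperparameter (say, units_per_layer) changes the parameter count, while training-related hyperparameters such as the learning rate leave it unchanged.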

..................Content has been hidden....................

You can't read the all page of ebook, please click here login for view all page.
Reset
18.223.237.131