There's more...

Neural networks are employed in a wide variety of tasks, which can be broadly classified into two categories: function approximation (regression) and classification. The choice of activation function depends on the task at hand. In general, ReLU neurons are the better choice for hidden layers. For the output layer, softmax is normally the better choice in classification tasks because it yields a probability distribution over the classes; for regression problems, a linear output is typical, while sigmoid or hyperbolic tangent can be used when the target values are bounded.
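As a minimal sketch of these activation functions (using NumPy directly rather than any particular framework), the following shows ReLU clipping negative hidden-layer activations, sigmoid squashing values into (0, 1), and softmax turning output scores into a probability distribution:

```python
import numpy as np

def relu(x):
    # Common choice for hidden layers: cheap to compute and
    # does not saturate for positive inputs
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes values into (0, 1); useful when a regression
    # target is bounded to that range
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Typical output layer for multi-class classification:
    # subtract the max for numerical stability, then normalize
    e = np.exp(x - np.max(x))
    return e / e.sum()

z = np.array([-1.0, 0.0, 2.0])
print(relu(z))        # negative inputs clipped to zero
print(sigmoid(z))     # each value mapped into (0, 1)
p = softmax(z)
print(p, p.sum())     # probabilities that sum to 1
```

Note that softmax reduces to sigmoid in the two-class case, which is why sigmoid is also seen as the output activation for binary classification.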
