Softmax

Sometimes we need an activation function whose output can represent more than two classes. Softmax is an activation function that provides this, which makes it best suited to multiclass classification problems. Let's assume that we have n classes. The input values map to the n classes as follows:

$x = \{x_1, x_2, \ldots, x_n\}$

Softmax is grounded in probability theory: its outputs are non-negative and sum to 1, so they can be interpreted as class probabilities. The output probability of the i-th class is calculated as follows:

$$\mathrm{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}$$
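To make the formula concrete, here is a minimal NumPy sketch of softmax (the function name `softmax` and the sample vector `scores` are illustrative, not from the text). Subtracting the maximum before exponentiating is a standard numerical-stability trick; since softmax is invariant to adding a constant to every input, it does not change the result:

```python
import numpy as np

def softmax(x):
    # Subtract the max before exponentiating: softmax is invariant
    # to shifting all inputs by a constant, and this avoids overflow.
    exps = np.exp(x - np.max(x))
    return exps / np.sum(exps)

scores = np.array([2.0, 1.0, 0.1])   # hypothetical inputs for n = 3 classes
probs = softmax(scores)
print(probs)        # approximately [0.659 0.242 0.099]
print(probs.sum())  # 1.0 -- the outputs form a probability distribution
```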

For binary classifiers, the activation function in the final layer is sigmoid; for multiclass classifiers, it is softmax.
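As a sketch of how this choice looks in practice (assuming TensorFlow/Keras is available; the layer sizes are illustrative):

```python
import tensorflow as tf

# Binary classifier: a single sigmoid unit in the final layer.
binary_head = tf.keras.layers.Dense(1, activation='sigmoid')

# Multiclass classifier: one softmax unit per class.
n_classes = 10  # illustrative value
multiclass_head = tf.keras.layers.Dense(n_classes, activation='softmax')
```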