Step functions

Of course, the most basic activation function is the step function. If the value of x is at or above a fixed threshold (0 in this example), then y is 1; otherwise, y is 0, as shown in the following code:

// step returns 1 if x is at or above the threshold of 0, and 0 otherwise.
func step(x float64) float64 {
    if x >= 0 {
        return 1
    }
    return 0
}

As you can see in the following diagram, the step function is extremely simple; it takes a value and then returns 0 or 1:

This is a very simple function, but it is not particularly useful for deep learning. Its gradient is zero everywhere (and undefined at the step itself), so during backpropagation it produces zero gradients, which means the weights receive little or no update and the network barely improves, if at all.
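To make this concrete, here is a minimal sketch of a single gradient-descent weight update with a step activation. The function name stepGrad, the variable names, and the learning rate are illustrative assumptions, not part of the original text; the point is only that a zero derivative blocks the update via the chain rule:

package main

import "fmt"

// stepGrad is the derivative of the step function: zero everywhere
// except at the step itself, where it is undefined (treated here as zero).
func stepGrad(x float64) float64 {
    return 0
}

func main() {
    weight := 0.5
    input := 2.0
    learningRate := 0.1
    upstreamGrad := 1.2 // gradient flowing back from the next layer

    // Chain rule: the gradient with respect to the weight is
    // upstreamGrad * stepGrad(weight*input) * input, which is always zero.
    weight -= learningRate * upstreamGrad * stepGrad(weight*input) * input

    fmt.Println(weight) // still 0.5: the step activation blocks all learning
}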
