Loss functions

Loss functions play an important role in training our network. We haven't discussed them in much detail, but their job is to measure how wrong the model's predictions are, so it can learn from its mistakes.

In this example, we are using a version of cross-entropy loss that has been simplified to keep the computation as cheap as possible.

Expressed in pseudocode, cross-entropy loss would typically look like this:

crossEntropyLoss = -1 * sum(actual_y * log(predicted_y))

However, in our case, we are going for a simpler version that drops the logarithm:

loss = -1 * mean(actual_y * predicted_y)

So, we are implementing the loss function as follows:

// Element-wise (Hadamard) product of the network's output and the labels
losses, err := gorgonia.HadamardProd(m.out, y)
if err != nil {
    log.Fatal(err)
}
// Take the mean over all elements, then negate:
// loss = -mean(actual_y * predicted_y)
cost := gorgonia.Must(gorgonia.Mean(losses))
cost = gorgonia.Must(gorgonia.Neg(cost))

// We want to track the cost: Read copies the value of cost into costVal
// each time the expression graph is evaluated
var costVal gorgonia.Value
gorgonia.Read(cost, &costVal)
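
To see what these two formulas actually compute, here is a small, self-contained Go sketch that evaluates both on a single example. The label and prediction values are made up purely for illustration:

package main

import (
    "fmt"
    "math"
)

func main() {
    // One-hot label: the true class is class 1 (hypothetical values)
    actualY := []float64{0, 1, 0}
    // Softmax-style predictions from an imaginary network
    predictedY := []float64{0.1, 0.7, 0.2}

    // Cross-entropy: -sum(actual_y * log(predicted_y))
    var ce float64
    for i := range actualY {
        ce -= actualY[i] * math.Log(predictedY[i])
    }

    // Simplified loss: -mean(actual_y * predicted_y)
    var simple float64
    for i := range actualY {
        simple += actualY[i] * predictedY[i]
    }
    simple = -simple / float64(len(actualY))

    fmt.Printf("cross-entropy: %.4f\n", ce)     // 0.3567
    fmt.Printf("simplified:    %.4f\n", simple) // -0.2333
}

Both losses decrease as the predicted probability of the true class grows, which is why the simpler version can still train the network. The difference is that the log term in cross-entropy blows up as the predicted probability of the true class approaches zero, so it penalizes confident mistakes much more heavily.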

As an exercise, you can modify the loss function to the more commonly used cross-entropy loss and compare your results.
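
As a starting point for that exercise, a minimal sketch might look like the following. This assumes m.out holds strictly positive probabilities (for example, the output of a softmax layer), since the logarithm is undefined at zero:

// A possible cross-entropy variant: -sum(actual_y * log(predicted_y))
logProbs, err := gorgonia.Log(m.out)
if err != nil {
    log.Fatal(err)
}
losses, err := gorgonia.HadamardProd(logProbs, y)
if err != nil {
    log.Fatal(err)
}
cost := gorgonia.Must(gorgonia.Sum(losses))
cost = gorgonia.Must(gorgonia.Neg(cost))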
