Cost function

The cost function we will be using is called categorical cross-entropy, also known as multinomial cross-entropy. It's really just a generalization of the binary cross-entropy function that we saw in Chapter 4, Using Keras for Binary Classification.

Instead of just showing you categorical cross-entropy, let's look at them both together. I'm going to assert they are equal, and then explain why:

$$-\frac{1}{n}\sum_{i=1}^{n}\left[y_i\log(\hat{y}_i)+(1-y_i)\log(1-\hat{y}_i)\right] \;=\; -\frac{1}{n}\sum_{i=1}^{n}\sum_{j=0}^{m-1} y_{ij}\log(\hat{y}_{ij})$$

The preceding equation is true when m = 2, that is, when there are only two classes.

OK, don't freak out. I know, that's a whole bunch of math. The categorical cross-entropy equation is the one on the right; binary cross-entropy is on the left. Now, imagine a situation where m = 2. In this case, you can probably see that summing $y_{ij}\log(\hat{y}_{ij})$ for both j = 0 and j = 1, for each example i, is equal to the result you'd get from binary cross-entropy, because with two classes $y_{i0} = 1 - y_{i1}$ and $\hat{y}_{i0} = 1 - \hat{y}_{i1}$. Hopefully that reduction is enough to make sense of categorical cross-entropy. If not, I'd suggest picking a few values and coding it up, as in the sketch below. It will only take a second and you'll thank me later!
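Here's a minimal sketch of that exercise in NumPy. The labels and predicted probabilities are made up purely for illustration; any valid values will do:

```python
import numpy as np

# A few made-up binary labels and predicted probabilities for class 1.
y_true = np.array([1, 0, 1, 1])
y_pred = np.array([0.9, 0.2, 0.7, 0.6])

# Binary cross-entropy, averaged over the n examples.
bce = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# The same data in two-column (one-hot) form, so that m = 2.
Y_true = np.stack([1 - y_true, y_true], axis=1)  # columns: class 0, class 1
Y_pred = np.stack([1 - y_pred, y_pred], axis=1)

# Categorical cross-entropy: sum over the m classes, average over examples.
cce = -np.mean(np.sum(Y_true * np.log(Y_pred), axis=1))

print(bce, cce)  # both print the same value
```

Both expressions come out identical, because the j = 0 column is just 1 minus the j = 1 column, which is exactly the reduction described above.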
