Getting Ready...

Before starting with the recipe, let's explore the KL divergence, D_KL, a little more. It is a non-symmetric measure of the difference between two distributions, in our case, ρ and ρ_hat. When ρ and ρ_hat are equal, it is zero; otherwise, it increases monotonically as ρ_hat diverges from ρ. Mathematically, for the two Bernoulli distributions with means ρ and ρ_hat, it is expressed as follows:

D_KL(ρ || ρ_hat) = ρ log(ρ / ρ_hat) + (1 - ρ) log((1 - ρ) / (1 - ρ_hat))

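To make this behavior concrete, here is a minimal NumPy sketch of the formula above; the helper name kl_divergence is our own, not part of the recipe's code:

```python
import numpy as np

def kl_divergence(rho, rho_hat):
    """KL divergence between two Bernoulli distributions with means rho and rho_hat."""
    return rho * np.log(rho / rho_hat) + (1 - rho) * np.log((1 - rho) / (1 - rho_hat))

print(kl_divergence(0.3, 0.3))  # 0.0: the two distributions match
print(kl_divergence(0.3, 0.1))  # ~0.154: positive, and it grows as rho_hat diverges from rho
```
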
Here is the plot of D_KL for a fixed ρ = 0.3, where we can see that when ρ_hat = 0.3, D_KL = 0; otherwise, it increases monotonically on both sides as ρ_hat moves away from 0.3.
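The original figure is not reproduced in this extract, but a short sketch (assuming NumPy and Matplotlib are available) regenerates the same curve:

```python
import numpy as np
import matplotlib.pyplot as plt

rho = 0.3
rho_hat = np.linspace(0.01, 0.99, 200)  # stay away from 0 and 1 to avoid log(0)
d_kl = rho * np.log(rho / rho_hat) + (1 - rho) * np.log((1 - rho) / (1 - rho_hat))

plt.plot(rho_hat, d_kl)
plt.axvline(rho, linestyle='--')  # the minimum, D_KL = 0, sits at rho_hat = rho
plt.xlabel('rho_hat')
plt.ylabel('D_KL')
plt.title('KL divergence for fixed rho = 0.3')
plt.show()
```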
