How it works...

All of this work, and what do we have to show for it? Well, I decided to make a little graphic to show you the fruits of your labor throughout this chapter. Here are the MNIST digit generator results over 40,000 epochs of training:

One of the mind-blowing things about these results is that you can see what the generator produces at the first epoch: essentially pure noise. As the adversarial training continues, the generator gradually learns to concentrate pixels toward the center of the image, but at the 5,000-epoch mark it is still hard to discern any recognizable digits. By 15,000 epochs, it becomes clear that actual numbers are being produced and you can start to make them out. At 40,000 epochs, the generator renders a few digits quite well, though the 1 and several of the other digits still need additional refinement. So, what happens if we train a GAN on only a single digit from the MNIST data?

Let's check out some of the results from my generator trained only on the digit 3:

The first thing you will notice is that the GAN converges much more quickly with the 3-only model. Because every training image belongs to the same class, the model can learn this image type very well. Eventually, at 40,000 epochs, the GAN is able to produce realistic-looking 3s in almost all of the example cases pulled for this particular graphic.
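If you want to reproduce this single-digit experiment, the only change needed is to filter the MNIST training set down to one class before training. Here is a minimal sketch, assuming the Keras MNIST loader and the usual [-1, 1] pixel scaling for a tanh-output generator (the variable names here are mine, not from the recipe):

    import numpy as np
    from tensorflow.keras.datasets import mnist

    # Load the full MNIST training split
    (x_train, y_train), (_, _) = mnist.load_data()

    # Keep only the images labeled with the target digit
    digit = 3
    x_digit = x_train[y_train == digit]

    # Rescale pixels from [0, 255] to [-1, 1] and add a channel axis,
    # matching the (28, 28, 1) input shape the discriminator expects
    x_digit = (x_digit.astype(np.float32) - 127.5) / 127.5
    x_digit = np.expand_dims(x_digit, axis=-1)

    print(f"{len(x_digit)} training images of the digit {digit}")

From here, x_digit can be fed to the same training loop used earlier in the chapter in place of the full training set.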

So, what does this tell us about training GANs on different datasets? Let's go over some high-level points:

  • A more constrained space for the GAN to learn over will result in faster convergence
  • The digits dataset as a whole is more challenging because the generator must cover ten classes at once, and some of those digits (such as 1 and 7, or 3 and 8) look similar:
    • The model would need to train longer (and potentially with more examples) to learn an appropriate representation of the training data
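One practical note that applies to both experiments: progress graphics like the ones in this recipe are produced by saving a grid of generator samples at fixed epoch intervals during training. Here is a minimal sketch, assuming a Keras-style generator that maps a 100-dimensional latent vector to a 28x28x1 image in the [-1, 1] range (save_sample_grid is a hypothetical helper name, not part of the recipe's code):

    import numpy as np
    import matplotlib.pyplot as plt

    def save_sample_grid(generator, epoch, latent_dim=100, rows=4, cols=4):
        # Sample latent vectors and generate a batch of images
        noise = np.random.normal(0, 1, (rows * cols, latent_dim))
        images = generator.predict(noise, verbose=0)

        # Rescale from [-1, 1] back to [0, 1] for display
        images = 0.5 * images + 0.5

        fig, axes = plt.subplots(rows, cols, figsize=(cols, rows))
        for ax, img in zip(axes.flat, images):
            ax.imshow(img.squeeze(), cmap='gray')
            ax.axis('off')
        fig.savefig(f'samples_epoch_{epoch}.png')
        plt.close(fig)

    # Inside the training loop, for example:
    # if epoch % 5000 == 0:
    #     save_sample_grid(generator, epoch)

Stitching these checkpoint images together is all it takes to build the kind of epoch-by-epoch comparison shown earlier in this section.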