Loss interpretability

One of the problems when training GANs is that the generator and discriminator loss values are hard to interpret. It is not like training a classifier, where you simply wait for the loss to drop to see whether the model is learning.

With GANs, a falling loss does not necessarily mean that the model is improving.

Based on the experience of many practitioners and researchers, here are some tips on how to read GAN loss values:

  • You don't want the discriminator loss to drop very quickly, because a discriminator that wins too easily provides almost no useful feedback for the generator to improve.
  • If the generator loss falls quickly, it means the generator has found a single weakness in the discriminator and is exploiting that weakness again and again. When this happens, it is called mode collapse. (A simple monitoring sketch follows this list.)
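The following is a minimal, framework-agnostic sketch of how you might watch for these two warning signs during training. The window size and thresholds are hypothetical choices for illustration, not values from any particular library:

```python
from collections import deque

d_history = deque(maxlen=100)   # recent discriminator losses
g_history = deque(maxlen=100)   # recent generator losses

def check_losses(d_loss, g_loss):
    """Flag the two warning signs described in the list above."""
    d_history.append(d_loss)
    g_history.append(g_loss)
    if len(d_history) < d_history.maxlen:
        return  # not enough history yet to judge a trend

    # Discriminator loss near zero: D is winning too easily, so G
    # receives almost no useful feedback.
    if sum(d_history) / len(d_history) < 1e-3:
        print("Warning: discriminator loss ~0; generator may stop improving")

    # Generator loss dropping very fast relative to the recent window:
    # a possible sign of mode collapse, so inspect the samples.
    if g_history[-1] < 0.1 * g_history[0]:
        print("Warning: generator loss fell sharply; check samples for mode collapse")
```

You would call `check_losses` once per training step with the current loss values; the exact thresholds should be tuned to your model and loss formulation.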

The loss is really only useful for spotting when something has gone wrong during training. Consequently, there is no reliable way to tell from the loss that training has converged. In practice, the best approach is to keep inspecting the generator's output: make sure the samples look close to what you expect and that there is good variety among them. A sketch of this kind of periodic inspection follows.
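As one way to do this, the sketch below periodically saves a grid of samples from a fixed noise batch so images are comparable across checkpoints. It assumes a PyTorch generator (here called `netG`) that maps a `(batch, latent_dim)` noise tensor to images; `netG`, `latent_dim`, and the output directory are hypothetical names for illustration:

```python
import os
import torch
from torchvision.utils import save_image

latent_dim = 100
# Reuse the same noise every time so successive grids are directly comparable.
fixed_noise = torch.randn(64, latent_dim)

def save_samples(netG, step, out_dir="samples"):
    """Save an 8x8 grid of generator outputs for visual inspection."""
    os.makedirs(out_dir, exist_ok=True)
    netG.eval()
    with torch.no_grad():
        fake = netG(fixed_noise)
    netG.train()
    # Inspect these grids for both image quality and variety across samples.
    save_image(fake, f"{out_dir}/step_{step:06d}.png", nrow=8, normalize=True)
```

Calling `save_samples(netG, step)` every few hundred steps gives you a visual record of training progress that is usually far more informative than the loss curves.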
