Initialization variables (__init__ in the Discriminator class)

When creating a class in any object-oriented programming language, selecting which variables and quantities are initialized in __init__ is an important step. In this case, we need to define the capacity of the model and the shape of the input, initialize the optimizer, and build the model, as follows:

  1. Class initialization with width, height, channels, and latent space size:

class Discriminator(object):
    def __init__(self, width=28, height=28, channels=1,
                 latent_size=100):

  2. Add the input arguments as internal variables to the class:

        self.CAPACITY = width*height*channels
        self.SHAPE = (width, height, channels)
        self.OPTIMIZER = Adam(lr=0.0002, decay=8e-9)

  3. Initialize the model based on the method we will define later in this recipe:

        self.Discriminator = self.model()

  4. Compile the model with a binary_crossentropy loss and our specified optimizer:

        self.Discriminator.compile(loss='binary_crossentropy',
                                   optimizer=self.OPTIMIZER,
                                   metrics=['accuracy'])

  5. Display a text summary of the model in the terminal:

        self.Discriminator.summary()
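Put together, the steps above form the full __init__. The following is a minimal runnable sketch, assuming tensorflow.keras imports and a placeholder Sequential body for the model() method that the recipe defines later; newer versions of Keras use learning_rate= in place of the legacy lr= argument shown in the recipe:

```python
# A minimal sketch of the assembled Discriminator __init__.
# The model() body below is a placeholder assumption; the recipe
# defines the real architecture later.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.optimizers import Adam

class Discriminator(object):
    def __init__(self, width=28, height=28, channels=1, latent_size=100):
        self.CAPACITY = width * height * channels   # 784 for MNIST-sized input
        self.SHAPE = (width, height, channels)
        # Newer tf.keras spells the argument learning_rate=, not lr=
        self.OPTIMIZER = Adam(learning_rate=0.0002)
        self.Discriminator = self.model()
        self.Discriminator.compile(loss='binary_crossentropy',
                                   optimizer=self.OPTIMIZER,
                                   metrics=['accuracy'])
        self.Discriminator.summary()

    def model(self):
        # Placeholder architecture: flatten the image and emit a single
        # real/fake probability
        m = Sequential()
        m.add(Flatten(input_shape=self.SHAPE))
        m.add(Dense(self.CAPACITY, activation='relu'))
        m.add(Dense(1, activation='sigmoid'))
        return m
```

Constructing `Discriminator()` builds, compiles, and prints the model summary in one step, which is the pattern the rest of the recipe relies on.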

In this case, we call the model method and compile the model using binary_crossentropy as the loss. During the training step, the Adam optimizer updates the weights. One key point to highlight: earlier we made a big deal about the loss function for GANs, so why are we only using a built-in loss function here? Simply put, this is an example GAN, and we'll have many more opportunities to implement custom loss functions. Modern GAN architectures often rely on custom loss functions to achieve better results. In this case, the key is to ensure that we can build the basic structure and get it to train; after that, we can focus on building custom functionality.
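For intuition about what the built-in loss computes: binary cross-entropy for a predicted probability p and true label y is -(y log p + (1 - y) log(1 - p)), averaged over the batch. A small pure-Python illustration (the function name bce and the sample values are hypothetical, not part of the recipe):

```python
import math

def bce(y_true, y_pred, eps=1e-7):
    """Binary cross-entropy averaged over a batch.
    eps clips predictions away from 0 and 1 to avoid log(0)."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# A confident, correct discriminator gives a small loss;
# a confident, wrong one gives a large loss.
low = bce([1, 0], [0.9, 0.1])   # predictions match labels
high = bce([1, 0], [0.1, 0.9])  # predictions are inverted
```

This is exactly the quantity Keras minimizes when we pass loss='binary_crossentropy'; a custom GAN loss simply replaces this formula with one tailored to the adversarial setup.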
