Model definition of the generator

The model is the heart of each of these classes. In this case, we are defining a model that is going to take a sample from the latent space as an input and use it to produce an image with the same shape as the original image. Let's break down this model code to understand how this is happening:

  1. First, let's define the model method and begin with the basic Sequential structure:

def model(self, block_starting_size=128, num_blocks=4):
    model = Sequential()
  2. Next, we start with our first block of layers in the neural network:

    block_size = block_starting_size
    model.add(Dense(block_size, input_shape=(self.LATENT_SPACE_SIZE,)))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))

This adds a dense layer to the network whose input shape matches a latent-space sample and whose width is our initial block size; in this case, we are starting with 128 neurons. The LeakyReLU activation layer helps us avoid vanishing gradients and dead (non-activated) neurons by allowing a small gradient for negative inputs. Then, BatchNormalization normalizes the activations coming out of the previous layer, which improves the training stability and efficiency of the network.
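The behavior LeakyReLU adds can be illustrated with a minimal standalone sketch. The function below is a hypothetical pure-Python version for illustration only, not the Keras layer itself:

```python
def leaky_relu(x, alpha=0.2):
    # Positive inputs pass through unchanged; negative inputs are scaled
    # by alpha instead of being zeroed out, so the neuron keeps a small
    # gradient and never fully "dies".
    return x if x > 0 else alpha * x

print(leaky_relu(3.0))   # 3.0
print(leaky_relu(-2.0))  # -0.4 (a plain ReLU would output 0.0 here)
```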

  3. Next, we have the trickiest part:

    for i in range(num_blocks - 1):
        block_size = block_size * 2
        model.add(Dense(block_size))
        model.add(LeakyReLU(alpha=0.2))
        model.add(BatchNormalization(momentum=0.8))

This loop adds additional blocks like the previous one, doubling the dense layer size each time. I'd encourage you to experiment with different numbers of blocks. What are the outcomes? Do you see increased performance? Faster convergence? Divergence? This code should allow you to experiment with this type of architecture in a more flexible way.
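To see exactly what widths the loop produces, here is a quick pure-Python sketch that mirrors the doubling logic above (the helper name is just for illustration):

```python
def block_sizes(block_starting_size=128, num_blocks=4):
    # The first block uses the starting size; each subsequent block
    # doubles the size of the one before it, as in the loop above.
    sizes = [block_starting_size]
    for _ in range(num_blocks - 1):
        sizes.append(sizes[-1] * 2)
    return sizes

print(block_sizes())        # [128, 256, 512, 1024]
print(block_sizes(64, 6))   # [64, 128, 256, 512, 1024, 2048]
```

With the defaults, four blocks take the network from 128 up to 1,024 neurons, which is a quick way to sanity-check a configuration before training.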

  4. The last piece of this method involves reshaping the output to match the shape of the input image and returning the model:

    model.add(Dense(self.W * self.H * self.C, activation='tanh'))
    model.add(Reshape((self.W, self.H, self.C)))

    return model
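To see why the final Dense layer's size and the Reshape target must agree, here is a hedged NumPy sketch with assumed dimensions (28x28x1 images and a batch of two; these values are illustrative, not fixed by the class):

```python
import numpy as np

# Assumed (hypothetical) image dimensions for illustration
W, H, C = 28, 28, 1

# Simulate the final Dense layer's pre-activation output for 2 samples:
# it must emit exactly W * H * C values per sample for Reshape to work
flat = np.random.randn(2, W * H * C)

out = np.tanh(flat)               # tanh squashes every value into (-1, 1)
images = out.reshape(2, W, H, C)  # the same operation Reshape performs

print(images.shape)               # (2, 28, 28, 1)
```

Because the generator's output lands in (-1, 1), training images are typically rescaled to that same range so the discriminator compares like with like.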