Building and compiling the networks

In this section, we will build the essential networks and prepare them for training. Perform the following steps:

  1. Start by defining the optimizer required for the training, as shown in the following code:
from keras.optimizers import Adam

# Define the common optimizer used to compile all of the networks
common_optimizer = Adam(0.0002, 0.5)

We will use the Adam optimizer with a learning rate of 0.0002 and a beta_1 value of 0.5, passed as the two positional arguments to Adam.

  2. Next, create the two discriminator networks, as shown in the following code:
discriminatorA = build_discriminator()
discriminatorB = build_discriminator()

As mentioned in The architecture of the discriminator network section, a CycleGAN has two discriminator networks, one for each image domain.
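For reference, the following is a minimal sketch of what build_discriminator() might look like. It follows a common PatchGAN-style layout; the input shape, filter counts, and number of layers here are assumptions, and the actual function should match the architecture described in that section:

from keras.layers import Input, Conv2D, LeakyReLU
from keras.models import Model

def build_discriminator():
    # Input shape is an assumption; use the image size of your dataset
    input_layer = Input(shape=(128, 128, 3))

    # Downsampling convolutional blocks with LeakyReLU activations
    x = Conv2D(64, kernel_size=4, strides=2, padding='same')(input_layer)
    x = LeakyReLU(alpha=0.2)(x)
    for filters in [128, 256, 512]:
        x = Conv2D(filters, kernel_size=4, strides=2, padding='same')(x)
        x = LeakyReLU(alpha=0.2)(x)

    # Single-channel patch output; each value scores one patch of the input image
    output = Conv2D(1, kernel_size=4, strides=1, padding='same')(x)

    return Model(inputs=[input_layer], outputs=[output])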

  3. Next, compile the discriminator networks, as follows:
discriminatorA.compile(loss='mse', optimizer=common_optimizer, metrics=['accuracy'])
discriminatorB.compile(loss='mse', optimizer=common_optimizer, metrics=['accuracy'])

We compile both discriminator networks with mse (mean squared error) as the loss function and accuracy as the metric.
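Because the discriminators produce patch-shaped outputs and are trained with a mean squared error loss, the real/fake labels used later during training are tensors of ones and zeros matching that output shape. The snippet below is only an illustration: the patch dimensions (16x16x1) and the names real_images_A and fake_images_A are assumptions, not values from this chapter:

import numpy as np

batch_size = 1
# Hypothetical label tensors matching a 16x16x1 patch output;
# the real patch size depends on the discriminator's architecture
real_labels = np.ones((batch_size, 16, 16, 1))
fake_labels = np.zeros((batch_size, 16, 16, 1))

# train_on_batch returns [loss, accuracy] because metrics=['accuracy'] was passed;
# real_images_A and fake_images_A are placeholder names for image batches
# d_loss_real = discriminatorA.train_on_batch(real_images_A, real_labels)
# d_loss_fake = discriminatorA.train_on_batch(fake_images_A, fake_labels)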

  4. Next, create the generator networks A (generatorAToB) and B (generatorBToA). The input to generator network A is a real image (realA) from dataset A, and its output is a translated image (fakeB) in domain B. The input to generator network B is a real image (realB) from dataset B, and its output is a translated image (fakeA) in domain A, as follows:
generatorAToB = build_generator()
generatorBToA = build_generator()

As mentioned in The architecture of a CycleGAN section, a CycleGAN has two generator networks. generatorAToB will translate an image from domain A to domain B. Similarly, generatorBToA will translate an image from domain B to domain A.
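As with the discriminator, the following is a minimal sketch of the encoder-residual-decoder layout that build_generator() typically follows in a CycleGAN. The input shape, filter counts, and the number of residual blocks are assumptions; the actual function should match the architecture described earlier:

from keras.layers import Input, Conv2D, Conv2DTranspose, Activation, Add
from keras.models import Model

def residual_block(x, filters=256):
    # Two 3x3 convolutions with a skip (shortcut) connection
    shortcut = x
    x = Conv2D(filters, kernel_size=3, strides=1, padding='same', activation='relu')(x)
    x = Conv2D(filters, kernel_size=3, strides=1, padding='same')(x)
    return Add()([shortcut, x])

def build_generator():
    # Input shape is an assumption; use the image size of your dataset
    input_layer = Input(shape=(128, 128, 3))

    # Encoder: downsample the input image
    x = Conv2D(64, kernel_size=7, strides=1, padding='same', activation='relu')(input_layer)
    x = Conv2D(128, kernel_size=3, strides=2, padding='same', activation='relu')(x)
    x = Conv2D(256, kernel_size=3, strides=2, padding='same', activation='relu')(x)

    # Transformer: a stack of residual blocks
    for _ in range(6):
        x = residual_block(x)

    # Decoder: upsample back to the input resolution
    x = Conv2DTranspose(128, kernel_size=3, strides=2, padding='same', activation='relu')(x)
    x = Conv2DTranspose(64, kernel_size=3, strides=2, padding='same', activation='relu')(x)

    # Map back to 3 channels with tanh so pixel values lie in [-1, 1]
    output = Conv2D(3, kernel_size=7, strides=1, padding='same')(x)
    output = Activation('tanh')(output)

    return Model(inputs=[input_layer], outputs=[output])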

We have now created two generator and two discriminator networks. In the next sub-section, we will create and compile an adversarial network.
