How to do it...

The GAN network in this case is arguably the easiest part to implement—we're simply going to link up our networks so they can train together:

  1. Import all of the libraries we need to use for this class:
#!/usr/bin/env python3
import sys
import numpy as np
from keras.models import Sequential, Model
from keras.layers import Input
from keras.optimizers import Adam, SGD
from keras.utils import plot_model

  2. Implement the __init__ method with the Adam optimizer and arrays of model_inputs and model_outputs:
class GAN(object):
    def __init__(self, model_inputs=[], model_outputs=[]):
        self.inputs = model_inputs
        self.outputs = model_outputs
        # Combined model that links the generator and discriminator together
        self.gan_model = Model(inputs=self.inputs, outputs=self.outputs)
        self.OPTIMIZER = Adam(lr=2e-4, beta_1=0.5)
        # Two losses, one per output: 'mse' for the adversarial output and
        # 'mae' for the generated image, weighted 100x
        self.gan_model.compile(loss=['mse', 'mae'],
                               loss_weights=[1, 100],
                               optimizer=self.OPTIMIZER)
        self.save_model()
        self.summary()

It's important to note that we need two separate loss functions, one for each model output, because of the way the inputs and outputs are hooked into this network. In the training script, you'll see how we connect the generator and discriminator into this GAN network; the sketch below previews that wiring.
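
To make the wiring concrete before we get to the training script, here is a minimal, self-contained sketch of how a generator and a discriminator could be hooked into this class. The tiny stand-in models, the image shape, and the two-input discriminator are illustrative assumptions only, not the networks built in the other recipes:

# A minimal sketch, not the book's training script: the stand-in models,
# image shape, and two-input discriminator are assumptions for illustration.
from keras.models import Model
from keras.layers import Input, Conv2D, Concatenate

img_shape = (64, 64, 3)  # assumed image shape for the sketch

# Stand-in generator: image in, image out
g_in = Input(shape=img_shape)
generator = Model(g_in, Conv2D(3, 3, padding='same', activation='tanh')(g_in))

# Stand-in discriminator: (generated image, conditioning image) in, validity map out
d_in_a = Input(shape=img_shape)
d_in_b = Input(shape=img_shape)
d_out = Conv2D(1, 3, padding='same', activation='sigmoid')(Concatenate()([d_in_a, d_in_b]))
discriminator = Model([d_in_a, d_in_b], d_out)
discriminator.trainable = False  # freeze it while the combined model trains the generator

# Hook both networks together: the first output feeds the 'mse' adversarial loss,
# the second feeds the 'mae' reconstruction loss weighted 100x
conditioning = Input(shape=img_shape)
fake_image = generator(conditioning)
validity = discriminator([fake_image, conditioning])

gan = GAN(model_inputs=[conditioning], model_outputs=[validity, fake_image])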

  3. Define a model method to give us access to the combined model outside of the class:
    def model(self):
        # Return the compiled combined model so it can be used outside the class
        return self.gan_model
  4. These are the normal helper functions from every one of our other chapters:
    def summary(self):
        return self.gan_model.summary()

    def save_model(self):
        plot_model(self.gan_model, to_file='/data/GAN_Model.png')
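
As a quick hypothetical usage check (relying only on the attributes and methods defined in the class above, and on the gan instance from the earlier sketch), the compiled combined model can be retrieved and inspected like this:

# Hypothetical usage: 'gan' is the instance built in the earlier sketch
combined = gan.model()   # the compiled combined model from __init__
combined.summary()       # prints the same summary as gan.summary()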