Helper functions

But wait, there's more! We've got a few helper functions to go over that we've been using throughout this class:

  1. First, we have a convenience function called sample_latent_space:
def sample_latent_space(self, instances):
    return np.random.normal(0, 1,
                            (instances, self.LATENT_SPACE_SIZE))

This function essentially wraps a call to NumPy in an easy-to-use method.
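
For reference, here's a minimal, self-contained sketch of the same sampling call outside of the class; the batch size of 16 and the latent-space size of 100 are illustrative values only (100 stands in for self.LATENT_SPACE_SIZE):

import numpy as np

# Draw 16 latent vectors from a standard normal distribution
noise = np.random.normal(0, 1, (16, 100))
print(noise.shape)  # (16, 100)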

  2. Next is the code to plot the model checkpoint. This function saves an image showing random samples of the generator's output. Let's briefly cover the core pieces of our plotting function (a standalone sketch follows these steps):
    1. Define a method for plotting checkpoint images that takes a numeric value, e (the epoch number), as input:

def plot_checkpoint(self, e):
    filename = "/data/sample_"+str(e)+".png"
    2. Create noise from the latent space, and then generate images with our generator:
    noise = self.sample_latent_space(16)
    images = self.generator.Generator.predict(noise)
    3. Plot these newly generated images; in this case, we produce 16 images at each epoch checkpoint:
    plt.figure(figsize=(10, 10))
    for i in range(images.shape[0]):
        plt.subplot(4, 4, i+1)
        image = images[i, :, :, :]
        image = np.reshape(image, [self.H, self.W])
        plt.imshow(image, cmap='gray')
        plt.axis('off')
    4. Finally, apply a tight layout, save the figure, and close it:
    plt.tight_layout()
    plt.savefig(filename)
    plt.close('all')
    return
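
If you want to experiment with the plotting code outside of the class, here is a minimal, self-contained sketch of the same idea. The generate_fn callable, the 28x28 image size, and the ./samples output directory are assumptions for illustration; in the class above, the generator, self.H, self.W, and the /data path play those roles:

import os
import numpy as np
import matplotlib.pyplot as plt

def plot_checkpoint(generate_fn, latent_dim, epoch, h=28, w=28, out_dir="./samples"):
    # Build the output filename for this epoch checkpoint
    os.makedirs(out_dir, exist_ok=True)
    filename = os.path.join(out_dir, "sample_" + str(epoch) + ".png")

    # Sample 16 latent vectors and map them to images with the generator
    noise = np.random.normal(0, 1, (16, latent_dim))
    images = generate_fn(noise)

    # Arrange the 16 samples in a 4x4 grid of grayscale images
    plt.figure(figsize=(10, 10))
    for i in range(images.shape[0]):
        plt.subplot(4, 4, i + 1)
        plt.imshow(np.reshape(images[i], [h, w]), cmap='gray')
        plt.axis('off')

    plt.tight_layout()
    plt.savefig(filename)
    plt.close('all')

# Example usage with a dummy "generator" that just returns random pixels
dummy_generator = lambda z: np.random.rand(z.shape[0], 28, 28, 1)
plot_checkpoint(dummy_generator, latent_dim=100, epoch=0)

During real training, you would pass something like generator.predict for generate_fn and the current epoch number for epoch, much as the class method above does.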

The key thing to notice here is that we still don't have any established metric for the goodness of our model's outputs. The first step was simply to ensure that we could train the model and that the loss converged to a minimum (fully trained). In future chapters, we are going to discuss metrics for evaluating the quality of the generator's output.
