The encoder network

The encoder network is a convolutional neural network (CNN) that encodes an image (x) into a latent vector (z), also known as a latent representation. Let's start by implementing the encoder network in the Keras framework.

Perform the following steps to implement the encoder network:

  1. Start by creating an input layer:
input_layer = Input(shape=(64, 64, 3))
  2. Next, add the first convolution block, which contains a 2D convolution layer and an activation function, with the following configuration:
    • Filters: 32
    • Kernel size: 5
    • Strides: 2
    • Padding: same
    • Activation: LeakyReLU with alpha equal to 0.2:
# 1st Convolutional Block
enc = Conv2D(filters=32, kernel_size=5, strides=2, padding='same')(input_layer)
enc = LeakyReLU(alpha=0.2)(enc)
  3. Next, add three more convolution blocks, each of which contains a 2D convolution layer followed by a batch normalization layer and an activation function, with the following configurations:
    • Filters: 64, 128, 256
    • Kernel size: 5, 5, 5
    • Strides: 2, 2, 2
    • Padding: same, same, same
    • Batch normalization: Each convolutional layer is followed by a batch normalization layer
    • Activations: LeakyReLU, LeakyReLU, LeakyReLU, each with alpha equal to 0.2:
# 2nd Convolutional Block
enc = Conv2D(filters=64, kernel_size=5, strides=2, padding='same')(enc)
enc = BatchNormalization()(enc)
enc = LeakyReLU(alpha=0.2)(enc)

# 3rd Convolutional Block
enc = Conv2D(filters=128, kernel_size=5, strides=2, padding='same')(enc)
enc = BatchNormalization()(enc)
enc = LeakyReLU(alpha=0.2)(enc)

# 4th Convolutional Block
enc = Conv2D(filters=256, kernel_size=5, strides=2, padding='same')(enc)
enc = BatchNormalization()(enc)
enc = LeakyReLU(alpha=0.2)(enc)
  4. Next, flatten the output from the last convolution block, as follows:
# Flatten layer
enc = Flatten()(enc)
Converting an n-dimensional tensor to a one-dimensional tensor (array) is called flattening.
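As a sanity check, the spatial size after each stride-2, 'same'-padded convolution is the ceiling of the input size divided by the stride. The following sketch (plain Python, no Keras required) traces the shapes through the four convolution blocks:

```python
import math

def same_conv_out(size, stride=2):
    # Output spatial size of a 'same'-padded convolution: ceil(size / stride)
    return math.ceil(size / stride)

size, channels = 64, 3  # input shape (64, 64, 3)
for filters in (32, 64, 128, 256):
    size = same_conv_out(size)
    channels = filters
    print(f"conv block -> ({size}, {size}, {channels})")

flattened = size * size * channels
print("flattened length:", flattened)  # 4 * 4 * 256 = 4096
```

Note that the flattened length (4 × 4 × 256 = 4,096) matches the number of units in the first fully-connected layer that follows.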
  5. Next, add a dense (fully-connected) layer followed by a batch normalization layer and an activation function, with the following configurations:
    • Units (nodes): 4,096
    • Batch normalization: Yes
    • Activation: LeakyReLU with alpha equal to 0.2:
# 1st Fully Connected Layer
enc = Dense(4096)(enc)
enc = BatchNormalization()(enc)
enc = LeakyReLU(alpha=0.2)(enc)

  6. Next, add the second dense (fully-connected) layer with the following configuration:
    • Units (nodes): 100
    • Activation: None:
# 2nd Fully Connected Layer
enc = Dense(100)(enc)
  7. Finally, create a Keras model and specify the inputs and outputs for the encoder network:
# Create a model
model = Model(inputs=[input_layer], outputs=[enc])

The entire code for the encoder network is shown here:

from keras.layers import Input, Conv2D, BatchNormalization, LeakyReLU, Flatten, Dense
from keras.models import Model


def build_encoder():
    """
    Encoder Network
    :return: Encoder model
    """
    input_layer = Input(shape=(64, 64, 3))

    # 1st Convolutional Block
    enc = Conv2D(filters=32, kernel_size=5, strides=2, padding='same')(input_layer)
    enc = LeakyReLU(alpha=0.2)(enc)

    # 2nd Convolutional Block
    enc = Conv2D(filters=64, kernel_size=5, strides=2, padding='same')(enc)
    enc = BatchNormalization()(enc)
    enc = LeakyReLU(alpha=0.2)(enc)

    # 3rd Convolutional Block
    enc = Conv2D(filters=128, kernel_size=5, strides=2, padding='same')(enc)
    enc = BatchNormalization()(enc)
    enc = LeakyReLU(alpha=0.2)(enc)

    # 4th Convolutional Block
    enc = Conv2D(filters=256, kernel_size=5, strides=2, padding='same')(enc)
    enc = BatchNormalization()(enc)
    enc = LeakyReLU(alpha=0.2)(enc)

    # Flatten layer
    enc = Flatten()(enc)

    # 1st Fully Connected Layer
    enc = Dense(4096)(enc)
    enc = BatchNormalization()(enc)
    enc = LeakyReLU(alpha=0.2)(enc)

    # 2nd Fully Connected Layer
    enc = Dense(100)(enc)

    # Create a model
    model = Model(inputs=[input_layer], outputs=[enc])
    return model

We have now successfully created a Keras model for the encoder network. Next, we will create a Keras model for the generator network.
