Selecting layers

Once the model is defined, model = CAE_CNN_Encoder(), it is important to select the layers that will be initialized with pretrained weights. Note that the structure of both networks, the one to be initialized and the one that provides the trained weights, must be the same. So, for example, the following snippet of code will select all layers whose names contain conv or fc:

from models import CAE_CNN_Encoder
model = CAE_CNN_Encoder()

list_convs = [v for v in tf.global_variables() if "conv" in v.name]
list_fc_linear = [v for v in tf.global_variables() if "fc" in v.name or "output" in v.name]

Note that these lists are populated from tf.global_variables(); if we print its contents, we can see that it holds all the model variables, as shown here:

[<tf.Variable 'conv1/kernel:0' shape=(5, 5, 1, 16) dtype=float32_ref>,
 <tf.Variable 'conv1/bias:0' shape=(16,) dtype=float32_ref>,
 <tf.Variable 'conv2/kernel:0' shape=(5, 5, 16, 32) dtype=float32_ref>,
 <tf.Variable 'conv2/bias:0' shape=(32,) dtype=float32_ref>,
 <tf.Variable 'fc1/kernel:0' shape=(1568, 200) dtype=float32_ref>,
 <tf.Variable 'fc1/bias:0' shape=(200,) dtype=float32_ref>,
 <tf.Variable 'logits/kernel:0' shape=(200, 10) dtype=float32_ref>,
 <tf.Variable 'logits/bias:0' shape=(10,) dtype=float32_ref>]
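
A listing like this can be produced with a simple loop over the collection, for example as follows; the exact names and shapes depend on the variable scopes defined inside CAE_CNN_Encoder:

# Print every variable currently registered in the default graph
for v in tf.global_variables():
    print(v)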

Once the layers of the defined graph are grouped into two lists, convolutional and fully connected, you will use tf.train.Saver to load the weights that you prefer. First, we need to create a saver object, passing as input the list of variables that we want to load from a checkpoint, as follows:

# Define the saver object to load only the conv variables
saver_load_autoencoder = tf.train.Saver(var_list=list_convs)

In addition to saver_load_autoencoder, we need to create another saver object that will allow us to store all the variables of the network being trained into checkpoints.

# Define saver object to save all the variables during training
saver = tf.train.Saver()
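
Later on, once the session described next has been created and training is under way, this saver can periodically write checkpoints. A minimal sketch is shown here; the checkpoint directory and the global_step value are placeholders for illustration, not paths used elsewhere in this chapter:

# Illustrative only: persist all variables of the network being trained
# ("/tmp/cnn_classifier/model.ckpt" and global_step=1000 are placeholder values)
save_path = saver.save(sess, "/tmp/cnn_classifier/model.ckpt", global_step=1000)
print("Checkpoint written to:", save_path)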

Then, after the graph is initialized with init = tf.global_variables_initializer() and a session is created, we can use saver_load_autoencoder to restore the convolutional layers from a checkpoint as follows:

# Restore only the weights (From AutoEncoder)
saver_load_autoencoder.restore(sess, "../tmp/cae_cnn/model.ckpt-34")

Note that calling restore overrides global_variables_initializer, and all the selected weights are replaced by the ones from the checkpoint.
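
As a rough illustration of that ordering, the following sketch inspects the first convolutional kernel before and after the restore call; it assumes the init, sess, list_convs, and saver_load_autoencoder objects created above:

# Sketch: the randomly initialized value is discarded once restore runs
sess.run(init)                          # random initialization of all variables
w_random = sess.run(list_convs[0])      # e.g. conv1/kernel right after init
saver_load_autoencoder.restore(sess, "../tmp/cae_cnn/model.ckpt-34")
w_restored = sess.run(list_convs[0])    # same kernel, now holding pretrained weights
print((w_random == w_restored).all())   # expected to print False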
