Inference and Results

This brings us to the exciting part of our generative language model: creating custom content!  Inference in deep learning is the step where we take a trained model and expose it to new data to make predictions or classifications.  In this project, the model outputs we are after are new sentences, which become our novel custom content.  Let's see what our deep learning model can do!

We will use the code below to save checkpoints during training: the model weights are written to a binary HDF5 file whenever the monitored loss improves.

from keras.callbacks import ModelCheckpoint

# Save the weights to an HDF5 file whenever the training loss improves
filepath = "weights-{epoch:02d}-{loss:.4f}.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')
callbacks_list = [checkpoint]
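Keras fills in the `{epoch:02d}` and `{loss:.4f}` placeholders via `str.format` when it writes each checkpoint, which is where filenames like the one loaded below come from. A small illustration (the epoch and loss values here are made up):

```python
filepath = "weights-{epoch:02d}-{loss:.4f}.hdf5"

# Keras substitutes the epoch number and the monitored metric into the
# template at save time; the values below are hypothetical.
print(filepath.format(epoch=30, loss=1.5453))  # → weights-30-1.5453.hdf5
```

The `callbacks_list` itself takes effect only when passed to training, e.g. via the `callbacks` argument of `model.fit`.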

Now we will use the trained model to generate new text:

import sys
import numpy as np

seed_text = 'i want to generate new text after this '
print(seed_text)

# load the network weights from the best checkpoint
filename = "weights-30-1.545.hdf5"
model.load_weights(filename)
model.compile(loss='categorical_crossentropy', optimizer='adam')

for temperature in [0.5]:
    print('------ temperature:', temperature)
    sys.stdout.write(seed_text)

    # We generate 40 characters, one at a time
    for i in range(40):
        # One-hot encode the current seed window
        sampled = np.zeros((1, maxlen, len(chars)))
        for t, char in enumerate(seed_text):
            sampled[0, t, char_indices[char]] = 1.

        # Predict the next-character distribution and sample from it
        preds = model.predict(sampled, verbose=0)[0]
        next_index = sample(preds, temperature)
        next_char = chars[next_index]

        # Slide the seed window forward by one character
        seed_text += next_char
        seed_text = seed_text[1:]

        sys.stdout.write(next_char)
        sys.stdout.flush()
    print()
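The loop above calls a `sample` helper that is not defined in this snippet. A minimal sketch of the usual temperature-based sampling function (an assumption on our part, since the original definition is not shown):

```python
import numpy as np

def sample(preds, temperature=1.0):
    """Draw a character index from the softmax output `preds`,
    reweighted by `temperature` (lower = more conservative)."""
    preds = np.asarray(preds).astype('float64')
    # Rescale the log-probabilities by 1/temperature and renormalize
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    # Draw one sample from the reweighted distribution
    probas = np.random.multinomial(1, preds, 1)
    return np.argmax(probas)
```

At very low temperatures this almost always returns the most probable index; at high temperatures it behaves closer to a uniform draw.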

After successful model training, we see these results at the 30th epoch:

--- Generating with seed: "the "good old time" to which it belongs, and as an expressio"

------ temperature: 0.2
the "good old time" to which it belongs, and as an expression of the sense of the stronger and subli

------ temperature: 0.5
and as an expression of the sense of the stronger and sublication of possess and more spirit and in

------ temperature: 1.0
e stronger and sublication of possess and more spirit and instinge, and it: he ventlumentles, no dif

------ temperature: 1.2
d more spirit and instinge, and it: he ventlumentles, no differific and does amongly domen--whete ac

We find that with low values of the temperature hyperparameter, the model generates more practical, realistic words.  At higher temperatures the generated text becomes noticeably more interesting and unusual, some might even say creative, and the model occasionally invents new words that sound vaguely credible.  Low temperatures are therefore the more reasonable choice for business use cases where the output needs to be realistic, while higher temperature values suit more creative and artistic use cases.
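The effect is easy to see numerically: temperature rescales the log-probabilities before renormalizing, so low values sharpen the distribution toward the most likely character while high values flatten it. A small illustration (the three-character distribution below is made up):

```python
import numpy as np

def reweight(preds, temperature):
    # Rescale log-probabilities by 1/temperature, then renormalize
    logits = np.log(np.asarray(preds, dtype='float64')) / temperature
    exp = np.exp(logits)
    return exp / exp.sum()

preds = [0.5, 0.3, 0.2]        # hypothetical softmax output
print(reweight(preds, 0.2))    # sharpened: the top character dominates
print(reweight(preds, 1.2))    # flattened: the choices are more even
```

With temperature 0.2 the leading character's probability climbs above 0.9; with temperature 1.2 the three probabilities move closer together than in the original distribution.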

The art in deep learning and generative language models lies in balancing the learned structure against the randomness that makes the output interesting.
