How to do it...

We proceed with the recipe as follows:

  1. Define an attention mechanism with tf.contrib.seq2seq.LuongAttention, which implements the attention model described in Effective Approaches to Attention-based Neural Machine Translation by Minh-Thang Luong, Hieu Pham, and Christopher D. Manning (2015):
# attention_states: [batch_size, max_time, num_units]
attention_states = tf.transpose(encoder_outputs, [1, 0, 2])

# Create an attention mechanism
attention_mechanism = tf.contrib.seq2seq.LuongAttention(
    num_units, attention_states,
    memory_sequence_length=source_sequence_length)
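The attention_states tensor above is the "memory" that LuongAttention scores against at every decoding step. For context, here is a minimal sketch of how encoder_outputs could be produced, assuming the placeholder names encoder_cell and encoder_emb_inp (the embedded, time-major source batch), which are not part of this recipe:
import tensorflow as tf

# Hypothetical encoder: a single LSTM layer with num_units cells.
encoder_cell = tf.nn.rnn_cell.BasicLSTMCell(num_units)

# encoder_emb_inp: [max_time, batch_size, embedding_size], time-major.
encoder_outputs, encoder_state = tf.nn.dynamic_rnn(
    encoder_cell, encoder_emb_inp,
    sequence_length=source_sequence_length,
    time_major=True, dtype=tf.float32)
# encoder_outputs has shape [max_time, batch_size, num_units], which is
# why it is transposed to [batch_size, max_time, num_units] above.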
  2. Wrap the decoder cell with the attention mechanism defined in step 1 by means of tf.contrib.seq2seq.AttentionWrapper:
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    decoder_cell, attention_mechanism,
    attention_layer_size=num_units)
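The wrapped cell keeps track of attention state in addition to the underlying LSTM state, so the encoder's final state is handed over by cloning the wrapper's zero state rather than being used directly. A minimal sketch of driving the wrapped cell at training time, assuming batch_size, encoder_state, decoder_emb_inp, decoder_lengths, and projection_layer are placeholders defined elsewhere:
# Feed the encoder's final state into the attention-wrapped cell.
decoder_initial_state = decoder_cell.zero_state(
    batch_size, tf.float32).clone(cell_state=encoder_state)

# Teacher-forced decoding over the embedded target inputs.
helper = tf.contrib.seq2seq.TrainingHelper(
    decoder_emb_inp, decoder_lengths, time_major=True)
decoder = tf.contrib.seq2seq.BasicDecoder(
    decoder_cell, helper, decoder_initial_state,
    output_layer=projection_layer)
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
logits = outputs.rnn_output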
  3. Run the code to see the results. We immediately notice that the attention mechanism produces a significant improvement in terms of the BLEU score:
python -m nmt.nmt \
> --attention=scaled_luong \
> --src=vi --tgt=en \
> --vocab_prefix=/tmp/nmt_data/vocab \
> --train_prefix=/tmp/nmt_data/train \
> --dev_prefix=/tmp/nmt_data/tst2012 \
> --test_prefix=/tmp/nmt_data/tst2013 \
> --out_dir=/tmp/nmt_attention_model \
> --num_train_steps=12000 \
> --steps_per_stats=100 \
> --num_layers=2 \
> --num_units=128 \
> --dropout=0.2 \
> --metrics=bleu
[...]
# Start step 0, lr 1, Fri Sep 22 22:49:12 2017
# Init train iterator, skipping 0 elements
global step 100 lr 1 step-time 1.71s wps 3.23K ppl 15193.44 bleu 0.00
[...]
# Final, step 12000 lr 0.98 step-time 1.67 wps 3.37K ppl 14.64, dev ppl 14.01, dev bleu 15.9, test ppl 12.58, test bleu 17.5, Sat Sep 23 04:35:42 2017
# Done training!, time 20790s, Sat Sep 23 04:35:42 2017.
# Start evaluating saved best models.
[..]
loaded infer model parameters from /tmp/nmt_attention_model/best_bleu/translate.ckpt-12000, time 0.06s
# 608
src: nhưng bạn biết điều gì không ?
ref: But you know what ?
nmt: But what do you know ?
[...]
# Best bleu, step 12000 step-time 1.67 wps 3.37K, dev ppl 14.01, dev bleu 15.9, test ppl 12.58, test bleu 17.5, Sat Sep 23 04:36:35 2017
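Translations such as the nmt: hypothesis above are generated by greedy decoding with the same attention-wrapped cell; compared to training, only the helper changes. A minimal sketch, assuming embedding_decoder, tgt_sos_id, tgt_eos_id, and maximum_iterations are placeholders defined elsewhere:
# Greedy inference: feed back the most probable token at each step.
helper = tf.contrib.seq2seq.GreedyEmbeddingHelper(
    embedding_decoder,
    tf.fill([batch_size], tgt_sos_id), tgt_eos_id)
decoder = tf.contrib.seq2seq.BasicDecoder(
    decoder_cell, helper, decoder_initial_state,
    output_layer=projection_layer)
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(
    decoder, maximum_iterations=maximum_iterations)
translations = outputs.sample_id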