Putting it all together

As before, we will show the entire neural network structure here. Note that this structure is for the version of the model that includes GloVe vectors:

from keras.layers import (Input, Embedding, Conv1D, MaxPooling1D,
                          GlobalMaxPooling1D, Dense)
from keras.models import Model

def build_model(vocab_size, embedding_dim, sequence_length, embedding_matrix):
    # Input: sequences of word indices, padded to sequence_length
    sequence_input = Input(shape=(sequence_length,), dtype='int32')
    # Embedding layer initialized with the GloVe matrix and frozen
    embedding_layer = Embedding(input_dim=vocab_size,
                                output_dim=embedding_dim,
                                weights=[embedding_matrix],
                                input_length=sequence_length,
                                trainable=False,
                                name="embedding")(sequence_input)
    # Three convolution/pooling blocks followed by a dense classifier
    x = Conv1D(128, 5, activation='relu')(embedding_layer)
    x = MaxPooling1D(5)(x)
    x = Conv1D(128, 5, activation='relu')(x)
    x = MaxPooling1D(5)(x)
    x = Conv1D(128, 5, activation='relu')(x)
    x = GlobalMaxPooling1D()(x)
    x = Dense(128, activation='relu')(x)
    preds = Dense(20, activation='softmax')(x)

    model = Model(sequence_input, preds)
    model.compile(loss='categorical_crossentropy',
                  optimizer='adam',
                  metrics=['accuracy'])
    return model

I'm using adam, categorical_crossentropy, and accuracy here again. While this chapter introduces many new topics, it's hopefully somewhat comforting to see what remains constant.
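If it helps to see the pieces connected, here's a minimal sketch of how build_model might be called. The GloVe file name, word_index (assumed to come from a Keras Tokenizer), and the X_train/y_train/X_val/y_val arrays are illustrative assumptions, not code from this chapter:

import numpy as np

GLOVE_PATH = "glove.6B.100d.txt"   # assumed pre-trained GloVe file
EMBEDDING_DIM = 100
SEQUENCE_LENGTH = 1000

# Load the GloVe vectors into a word -> vector dictionary
embeddings_index = {}
with open(GLOVE_PATH, encoding="utf-8") as f:
    for line in f:
        values = line.split()
        embeddings_index[values[0]] = np.asarray(values[1:], dtype="float32")

# Build the embedding matrix; row i holds the vector for the word with index i
vocab_size = len(word_index) + 1   # word_index assumed from a Keras Tokenizer
embedding_matrix = np.zeros((vocab_size, EMBEDDING_DIM))
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:
        embedding_matrix[i] = vector

model = build_model(vocab_size, EMBEDDING_DIM, SEQUENCE_LENGTH, embedding_matrix)
model.fit(X_train, y_train,
          batch_size=128,
          epochs=10,
          validation_data=(X_val, y_val))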
