A working example - MNIST

The working example is based on the GitHub repository https://github.com/ywpkwon/siamese_tf_mnist. The code uses a Siamese network to embed handwritten MNIST digits into a 2D space, so that digits belonging to the same class end up embedded close together. The code consists of three major files:

run.py: This contains the basic wrapper that performs training. It uses the gradient descent algorithm to minimize the contrastive loss.
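
The repository's wrapper is written against the TensorFlow 1.x graph API; purely as an illustration, a modern-TensorFlow training step with plain gradient descent might look like the sketch below. The layer sizes, learning rate, and the embed/train_step names are assumptions, and loss_fn stands for the contrastive loss discussed under inference.py.

```python
import tensorflow as tf

# Hypothetical stand-in for the Siamese branch defined in inference.py:
# three fully connected layers mapping a flattened 28x28 digit to a 2D point.
embed = tf.keras.Sequential([
    tf.keras.layers.Dense(1024, activation="relu"),
    tf.keras.layers.Dense(1024, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Plain gradient descent (SGD with no momentum).
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

def train_step(x1, x2, y, loss_fn):
    # One gradient-descent step on a batch of digit pairs (x1, x2) with
    # pair labels y (1.0 = same digit, 0.0 = different digits).
    with tf.GradientTape() as tape:
        loss = loss_fn(embed(x1), embed(x2), y)
    grads = tape.gradient(loss, embed.trainable_variables)
    optimizer.apply_gradients(zip(grads, embed.trainable_variables))
    return loss
```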

inference.py: This contains the Siamese class, which defines a three-layer fully connected network. The similarity between the outputs of the two networks is the Euclidean distance. The partial generative loss and the partial imposter loss are then used to calculate the contrastive loss.
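
The distance and loss just described can be sketched as follows. This is a simplified illustration rather than the repository's exact code, and the margin value of 5.0 is an assumption:

```python
import tensorflow as tf

def contrastive_loss(out1, out2, y, margin=5.0):
    # out1, out2: [batch, 2] embeddings produced by the two weight-sharing
    # branches; y: 1.0 for same-class pairs, 0.0 for imposter pairs.
    eucd2 = tf.reduce_sum(tf.square(out1 - out2), axis=1)  # squared Euclidean distance
    eucd = tf.sqrt(eucd2 + 1e-6)

    # Partial loss for same-class pairs: pull their embeddings together.
    pos = y * eucd2
    # Partial imposter loss for different-class pairs: push their embeddings
    # apart until they are at least `margin` away from each other.
    neg = (1.0 - y) * tf.square(tf.maximum(margin - eucd, 0.0))

    return tf.reduce_mean(pos + neg)
```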

visualize.py: This is again just a wrapper to visualize the results.
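
A visualization wrapper in that spirit can be as small as the sketch below; the function name and plotting details are assumptions, not the repository's code:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_embeddings(embeddings, labels):
    # Scatter the 2D embeddings of the test digits, one colour per class.
    embeddings, labels = np.asarray(embeddings), np.asarray(labels)
    plt.figure(figsize=(8, 8))
    for digit in range(10):
        mask = labels == digit
        plt.scatter(embeddings[mask, 0], embeddings[mask, 1], s=3, label=str(digit))
    plt.legend(title="digit")
    plt.show()
```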

After the first 100,000 training steps, the results are as follows:

You can see that digits with the same label are embedded close together in the 2D space.

There is another interesting example at https://github.com/dhwajraj/deep-siamese-text-similarity.

Here, deep Siamese LSTM networks are trained with TensorFlow to capture phrase/sentence similarity using character embeddings.
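
As a rough, hedged sketch of that idea (the vocabulary size, embedding and LSTM dimensions, and the exponentiated Manhattan-distance similarity below are illustrative choices, not that repository's exact settings):

```python
import tensorflow as tf

# Illustrative sizes only.
NUM_CHARS, EMBED_DIM, LSTM_UNITS = 128, 64, 50

# A single encoder (character embedding followed by an LSTM) is applied to
# both inputs -- the weight sharing is what makes the network Siamese.
encoder = tf.keras.Sequential([
    tf.keras.layers.Embedding(NUM_CHARS, EMBED_DIM),
    tf.keras.layers.LSTM(LSTM_UNITS),
])

def similarity(chars_a, chars_b):
    # chars_a, chars_b: [batch, max_len] integer character ids of two sentences.
    h_a, h_b = encoder(chars_a), encoder(chars_b)
    # Distance between the two sentence encodings; identical sentences give
    # distance 0 and hence similarity 1.
    distance = tf.reduce_sum(tf.abs(h_a - h_b), axis=1)
    return tf.exp(-distance)
```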
