Word-embeddings in Go 

Word-embeddings in Go is an example of a task-specific ML library. It implements the shallow, two-layer neural networks needed to generate word embeddings with Word2vec and GloVe. The implementation is fast and clean: it covers a limited set of features, but it covers them well, in ways tailored specifically to the task of producing word embeddings.
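
To make that architecture concrete, the sketch below shows the shape of such a two-layer network in plain Go: an input-side embedding matrix (the layer we ultimately keep) and an output-side weight matrix, with a word's score for a context word computed as a dot product passed through a sigmoid. This is an illustrative sketch of the general Word2vec-style architecture, not code from the library itself, and all names in it are hypothetical.

package main

import (
	"fmt"
	"math"
	"math/rand"
)

// model holds the two weight matrices of a Word2vec-style network:
// in[i] is the dense embedding for word i (kept after training),
// out[j] is the output-side vector for word j (discarded after training).
type model struct {
	in, out [][]float64
}

func newModel(vocab, dim int) *model {
	m := &model{in: make([][]float64, vocab), out: make([][]float64, vocab)}
	for i := 0; i < vocab; i++ {
		m.in[i] = make([]float64, dim)
		m.out[i] = make([]float64, dim)
		for d := 0; d < dim; d++ {
			// Small random initialization for the input embeddings.
			m.in[i][d] = (rand.Float64() - 0.5) / float64(dim)
		}
	}
	return m
}

// score is the network's forward pass for one (center, context) pair:
// a dot product squashed through a sigmoid.
func (m *model) score(center, context int) float64 {
	var dot float64
	for d := range m.in[center] {
		dot += m.in[center][d] * m.out[context][d]
	}
	return 1.0 / (1.0 + math.Exp(-dot))
}

func main() {
	m := newModel(1000, 50)
	fmt.Println(m.score(3, 7)) // ~0.5 before any training
}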

One example is Stochastic Gradient Descent (SGD), an optimization method that is also a core requirement for training DNNs. The library uses it in its GloVe model, which was originally developed by a team at Stanford. However, the code is tied specifically to the GloVe model, and the additional training techniques used in Word2vec (negative sampling and the skip-gram architecture) are specific to word embeddings rather than reusable for general DNNs.
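
To see what one of those SGD steps looks like in isolation (independent of how the library wires it into GloVe or Word2vec), the sketch below updates the vectors for a single skip-gram training pair under negative sampling: the observed context word is pushed toward the center word's embedding, and a few randomly sampled noise words are pushed away. This is a minimal sketch of the standard skip-gram-with-negative-sampling update, and the function and variable names are hypothetical.

package main

import (
	"fmt"
	"math"
	"math/rand"
)

// sgnsStep performs one stochastic-gradient-descent update for a single
// skip-gram pair under negative sampling: the center embedding and the
// context/negative output vectors each move by lr times the gradient of
// the log-sigmoid loss.
func sgnsStep(center, context []float64, negatives [][]float64, lr float64) {
	grad := make([]float64, len(center)) // accumulated gradient for center

	update := func(out []float64, label float64) {
		var dot float64
		for d := range center {
			dot += center[d] * out[d]
		}
		// g = (label - sigmoid(dot)) * lr, the scalar part of the gradient.
		g := (label - 1.0/(1.0+math.Exp(-dot))) * lr
		for d := range center {
			grad[d] += g * out[d]  // accumulate for the center vector
			out[d] += g * center[d] // update the output-side vector in place
		}
	}

	update(context, 1) // observed pair: push together
	for _, neg := range negatives {
		update(neg, 0) // sampled noise: push apart
	}
	for d := range center {
		center[d] += grad[d]
	}
}

func main() {
	dim := 8
	randVec := func() []float64 {
		v := make([]float64, dim)
		for d := range v {
			v[d] = rand.Float64() - 0.5
		}
		return v
	}
	center, context := randVec(), randVec()
	negatives := [][]float64{randVec(), randVec()}
	sgnsStep(center, context, negatives, 0.025)
	fmt.Println(center[:3]) // embedding nudged by one SGD step
}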

This can still be useful for DL tasks, say, for generating an embedding layer (a dense vector representation of a text corpus) that can be fed into Long Short-Term Memory (LSTM) networks, which we will cover in Chapter 5, Next Word Prediction with Recurrent Neural Networks. However, all of the advanced functions we would need (for example, general-purpose gradient descent or backpropagation) and model components (the LSTM units themselves) are absent.
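
Even so, the vectors such a library produces can feed a downstream network directly, because an embedding layer is just a lookup table from token to dense vector. The sketch below shows that lookup in plain Go, turning a tokenized sentence into the per-timestep matrix of vectors an LSTM would consume; the table and its toy contents are hypothetical stand-ins for a trained model's output.

package main

import "fmt"

// embed converts a tokenized sentence into a sequence of dense vectors
// by looking each token up in a trained embedding table; unknown words
// fall back to a shared zero vector. The resulting [][]float64 is the
// embedding-layer input an LSTM would consume, one vector per timestep.
func embed(tokens []string, table map[string][]float64, dim int) [][]float64 {
	unk := make([]float64, dim)
	seq := make([][]float64, len(tokens))
	for i, tok := range tokens {
		if v, ok := table[tok]; ok {
			seq[i] = v
		} else {
			seq[i] = unk
		}
	}
	return seq
}

func main() {
	// Toy 3-dimensional embeddings standing in for trained output.
	table := map[string][]float64{
		"the": {0.1, -0.2, 0.3},
		"cat": {0.7, 0.0, -0.4},
	}
	seq := embed([]string{"the", "cat", "sat"}, table, 3)
	fmt.Println(len(seq), seq[2]) // 3 timesteps; "sat" maps to the zero vector
}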
