Restricted Boltzmann machine

A restricted Boltzmann machine (RBM) is an undirected neural network, a kind of generative stochastic network, that can learn a probability distribution over its set of inputs. As the name suggests, it originates from the Boltzmann machine, a recurrent neural network introduced in the 1980s. In a Boltzmann machine, every node or neuron is connected to all other nodes, which makes it difficult to train as the node count increases. Restricted means that the connections are limited: the neurons form two layers, a visible (input) layer and a hidden layer, with connections only between the layers and none within a layer, as shown in the following diagram:

Unlike in feedforward networks, the connections between the visible and hidden layers are undirected, so values can be propagated in both the visible-to-hidden and hidden-to-visible directions.
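This bidirectional flow can be illustrated with a minimal NumPy sketch. The layer sizes, weight scale, and function names here are illustrative assumptions, not part of any particular library; the key point is that the same weight matrix `W` is used in both directions, transposed for the hidden-to-visible pass:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 6 visible units, 3 hidden units
n_visible, n_hidden = 6, 3
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))  # one shared weight matrix
b = np.zeros(n_visible)   # visible biases
c = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def visible_to_hidden(v):
    # p(h = 1 | v): W used in the forward direction
    return sigmoid(v @ W + c)

def hidden_to_visible(h):
    # p(v = 1 | h): the same W, transposed, used in the backward direction
    return sigmoid(h @ W.T + b)

v = rng.integers(0, 2, size=n_visible).astype(float)  # a binary visible vector
p_h = visible_to_hidden(v)                  # activation probabilities of hidden units
h = (rng.random(n_hidden) < p_h).astype(float)  # sample binary hidden states
p_v = hidden_to_visible(h)                  # reconstruct the visible layer
```

Because the connections are undirected, the reconstruction `p_v` is obtained with the same parameters that produced `p_h`, which is exactly what a feedforward network cannot do.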

Training RBMs is based on the contrastive divergence algorithm. It uses a gradient-based update procedure, similar to backpropagation, to adjust the weights, and Gibbs sampling is applied on a Markov chain to estimate the gradient, that is, the direction in which to change the weights.
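The following is a minimal sketch of a single contrastive divergence step with one Gibbs step (CD-1), written in plain NumPy. The sizes, learning rate, and helper names are illustrative assumptions; the update rule itself (positive statistics from the data minus negative statistics from the reconstruction) is the standard CD-1 estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Illustrative sizes and learning rate
n_visible, n_hidden, lr = 6, 4, 0.1
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
b = np.zeros(n_visible)
c = np.zeros(n_hidden)

def cd1_step(v0):
    # Positive phase: hidden probabilities given the data vector v0
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(n_hidden) < ph0).astype(float)
    # One Gibbs step: reconstruct the visible layer, then the hidden layer again
    pv1 = sigmoid(h0 @ W.T + b)
    ph1 = sigmoid(pv1 @ W + c)
    # Gradient estimate: positive statistics minus negative statistics
    dW = np.outer(v0, ph0) - np.outer(pv1, ph1)
    return dW, v0 - pv1, ph0 - ph1

# Toy binary data, trained for a few epochs
data = rng.integers(0, 2, size=(50, n_visible)).astype(float)
for epoch in range(20):
    for v0 in data:
        dW, db, dc = cd1_step(v0)
        W += lr * dW
        b += lr * db
        c += lr * dc
```

Running more Gibbs steps (CD-k) gives a less biased gradient estimate at higher cost; in practice a single step is often enough to learn useful features.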

RBMs can also be stacked to create a class of networks known as deep belief networks (DBNs). In this case, the hidden layer of one RBM acts as the visible layer for the next RBM, as shown in the following diagram:

The training in this case is incremental: the network is trained greedily, layer by layer.
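The greedy layer-by-layer procedure can be sketched as follows, reusing a CD-1 training loop. The `train_rbm` helper, the layer sizes, and the hyperparameters are illustrative assumptions; the essential idea is that each trained RBM's hidden activations become the "visible" data for the next RBM in the stack:

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=10):
    """Train one RBM with CD-1; return (W, c) for the upward pass."""
    n_visible = data.shape[1]
    W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
    b = np.zeros(n_visible)
    c = np.zeros(n_hidden)
    for _ in range(epochs):
        for v0 in data:
            ph0 = sigmoid(v0 @ W + c)
            h0 = (rng.random(n_hidden) < ph0).astype(float)
            pv1 = sigmoid(h0 @ W.T + b)
            ph1 = sigmoid(pv1 @ W + c)
            W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
            b += lr * (v0 - pv1)
            c += lr * (ph0 - ph1)
    return W, c

# Toy binary dataset: 40 samples, 8 visible units
data = rng.integers(0, 2, size=(40, 8)).astype(float)

# Greedy layer-wise stacking: each layer's hidden activations
# feed the next RBM as its visible data
layer_sizes = [6, 4]
x = data
for n_hidden in layer_sizes:
    W, c = train_rbm(x, n_hidden)
    x = sigmoid(x @ W + c)  # propagate up to train the next layer
```

After this unsupervised pretraining, the stacked weights are typically fine-tuned jointly, for example with backpropagation on a supervised objective.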
