Hopfield recurrent neural networks

In 1982, physicist John J. Hopfield published a seminal article introducing a mathematical model commonly known as the Hopfield network. This network highlighted new computational capabilities deriving from the collective behavior of a large number of simple processing elements. A Hopfield network is a form of recurrent ANN.

According to Hopfield, any physical system can be considered a potential memory device if it has a certain number of stable states, which act as attractors for the system itself. On the basis of this consideration, he formulated the thesis that the stability and placement of such attractors are spontaneous properties of systems consisting of large numbers of mutually interacting neurons.

Structurally, the Hopfield network is a recurrent, symmetric neural network (its synaptic weight matrix is symmetric) that is fully connected: each neuron is connected to all the others, as shown in the following figure:

As already mentioned, a recurrent network is a neural model in which information flows bidirectionally: whereas in feedforward networks signals propagate only in the direction leading from the inputs to the outputs, in recurrent networks propagation can also occur from a later layer back to an earlier one, between neurons belonging to the same layer (as in the Hopfield network), and even from a neuron back to itself.

The dynamics of a Hopfield network are described by a nonlinear system of differential equations, and the neuron update mechanism can be one of the following (a minimal sketch of the asynchronous scheme follows this list):

  • Asynchronous: One neuron is updated at a time
  • Synchronous: All neurons are updated at the same time
  • Continuous: All the neurons are continually updated
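The following is a minimal sketch, not taken from the text above, of a discrete Hopfield network using the asynchronous update scheme. It assumes bipolar neuron states (+1/-1) and the standard Hebbian rule for building the symmetric weight matrix; the function and variable names are illustrative only.

```python
import numpy as np

def train_hopfield(patterns):
    """Build a symmetric weight matrix W from a 2D array of bipolar patterns.

    Assumption: standard Hebbian storage rule with no self-connections.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # Hebbian contribution of each stored pattern
    np.fill_diagonal(W, 0)       # no neuron is connected to itself
    return W / patterns.shape[0]

def recall(W, state, n_steps=200, rng=None):
    """Asynchronous dynamics: one randomly chosen neuron is updated at a time."""
    rng = np.random.default_rng() if rng is None else rng
    s = state.copy()
    for _ in range(n_steps):
        i = rng.integers(len(s))                  # pick a single neuron
        s[i] = 1 if W[i] @ s >= 0 else -1         # sign of its local field
    return s

# Usage: store one pattern, then recover it from a corrupted version.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[np.newaxis, :])
noisy = pattern.copy()
noisy[:2] *= -1                                   # flip two bits
print(recall(W, noisy))                           # settles back to the stored pattern
```

In a synchronous scheme, the inner loop would instead recompute all neuron states from the same previous state vector in one step; the asynchronous version shown here is the one for which convergence to a stable attractor is guaranteed when the weight matrix is symmetric with zero diagonal.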