Enhancement of Hopfield neural networks using stochastic noise processes

Hopfield neural networks (HNNs) are a class of densely connected, single-layer, nonlinear networks of perceptrons. The network's energy function is defined through a learning procedure so that its minima coincide with states from a predefined set. However, because of the network's nonlinearity, a number of undesirable local energy minima emerge from the learning procedure, which has been shown to significantly degrade the network's performance. In this work we present a bipolar HNN enhanced with a stochastic noise process. The presence of the stochastic process enables us to describe the network's evolution using Markov chain theory. For a fixed network topology, the desired final distribution of states can be reached by modulating the network's stochastic process. Guided by the desired final distribution, we propose a general L²-norm error density function optimization criterion for enhancing Hopfield neural network performance. This criterion can also be viewed in terms of stability intervals associated with the desired and non-desired stable states of the network. Because of the complexity of the general criterion, we relax the optimization to the set of non-desired states. We then formulate a stochastic process design based on the stability intervals that satisfies the optimization criterion and results in enhanced network performance. Our experimental simulations confirm the predicted improvement in performance.
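The paper's specific stochastic process design is not reproduced here; as a minimal sketch of the kind of network the abstract describes, the following implements a bipolar Hopfield network with Hebbian (outer-product) learning and additive noise injected into each neuron's local field. All function names, the Gaussian noise choice, and the annealing-style `noise_scale` parameter are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def train_hebbian(patterns):
    """Outer-product (Hebbian) learning of bipolar (+1/-1) patterns.

    patterns: array of shape (num_patterns, num_neurons).
    Returns a symmetric weight matrix with zero diagonal.
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, state, noise_scale=0.0, sweeps=10, rng=None):
    """Asynchronous bipolar updates with additive noise on the local field.

    With noise_scale == 0 this reduces to the deterministic HNN update;
    a positive noise_scale makes the state evolution a Markov chain, which
    is the setting the abstract analyzes. Gaussian noise is an assumption.
    """
    rng = np.random.default_rng() if rng is None else rng
    s = state.astype(float).copy()
    n = s.size
    for _ in range(sweeps):
        for i in rng.permutation(n):  # random asynchronous update order
            h = W[i] @ s + noise_scale * rng.standard_normal()
            s[i] = 1.0 if h >= 0.0 else -1.0
    return s
```

A typical use is to store a few patterns, corrupt one, and recover it; modulating `noise_scale` downward over sweeps would mimic, very loosely, the idea of shaping the process to escape non-desired minima.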
