Analysis and synthesis techniques for Hopfield type synchronous discrete time neural networks with application to associative memory

A qualitative theory for synchronous discrete-time Hopfield-type neural networks is established. The authors' objectives are accomplished in two phases. First, they analyze the class of neural networks considered; next, using these results, they develop a synthesis procedure for the same class. In the analysis section, techniques from the theory of large-scale interconnected dynamical systems are used to derive tests for the asymptotic stability of an equilibrium of the network, along with estimates of the rate at which trajectories converge from an initial condition to a final state. In the synthesis section, the stability tests from the analysis are used as constraints in a design algorithm for associative memories. The algorithm guarantees that each desired memory is stored as an equilibrium and that each such equilibrium is asymptotically stable. The applicability of these results is demonstrated by means of two specific examples.
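The paper contains no code; the following Python/NumPy sketch is illustrative only, under assumptions added here: bipolar state vectors, the standard synchronous update x(k+1) = sgn(W x(k)), and a simple projection (pseudo-inverse) storage rule that makes each prototype an exact equilibrium of the update. The function names (synthesize_weights, recall) and the example memories are hypothetical; the paper's actual design algorithm instead derives the interconnection matrix from the stability constraints obtained in the analysis and additionally guarantees asymptotic stability of each stored memory, which this sketch does not verify.

import numpy as np

def synthesize_weights(prototypes):
    # Projection (pseudo-inverse) storage rule: W = X X^+, so W x = x for every
    # stored prototype x. Used here as a stand-in for the paper's constraint-based
    # synthesis procedure (an assumption, not the authors' algorithm).
    X = np.asarray(prototypes, dtype=float).T       # n x m matrix of bipolar columns
    return X @ np.linalg.pinv(X)

def recall(W, x0, max_iters=100):
    # Synchronous discrete-time Hopfield-type iteration x(k+1) = sgn(W x(k)),
    # stopped at the first fixed point (an equilibrium of the network).
    x = np.where(np.asarray(x0, dtype=float) >= 0, 1.0, -1.0)
    for _ in range(max_iters):
        x_next = np.where(W @ x >= 0, 1.0, -1.0)    # sgn with sgn(0) taken as +1
        if np.array_equal(x_next, x):               # reached an equilibrium
            return x_next
        x = x_next
    return x                                        # no fixed point reached; synchronous updates can cycle

# Example: store two bipolar memories and recall from a one-bit-corrupted probe.
memories = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1, -1, -1,  1,  1]])
W = synthesize_weights(memories)
probe = np.array([ 1, -1,  1, -1, -1, -1])          # first memory with its fifth bit flipped
print(recall(W, probe))                             # converges back to the first memory

With this storage rule each prototype satisfies W x = x and is therefore a fixed point of the synchronous update; the paper's contribution goes further, supplying verifiable conditions under which such equilibria are asymptotically stable together with convergence-rate estimates.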
