Application of an evolution strategy to the Hopfield model of associative memory

We apply evolutionary computation to Hopfield's neural network model of associative memory. In the Hopfield model, a vast number of synaptic-weight combinations endow the network with an associative memory function. Furthermore, there is a trade-off between storage capacity and the size of the basins of attraction. The model can therefore be regarded as a test bed for multimodal and/or multi-objective function optimization. As a preliminary study, we investigate the basic behavior of an associative memory under simple evolutionary processes, and in this paper we present experiments using an evolution strategy.
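The setup described above can be illustrated with a minimal sketch: a Hopfield network whose weights are initialized by the Hebb rule and then evolved by a simple (1+1) evolution strategy, where a Gaussian-mutated weight matrix replaces the parent whenever it recalls the stored patterns at least as well. This is an illustrative sketch only, not the paper's actual algorithm; the network size, number of patterns, mutation step size, and the exact-recall fitness measure are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 32  # number of neurons (assumed for this sketch)
P = 4   # number of random bipolar patterns to store

patterns = rng.choice([-1, 1], size=(P, N))

def recall_fraction(W, patterns, steps=5):
    """Fitness: fraction of stored patterns the network retrieves exactly
    when started from the pattern itself (i.e. that are fixed points)."""
    ok = 0
    for xi in patterns:
        s = xi.copy()
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1  # break ties deterministically
        if np.array_equal(s, xi):
            ok += 1
    return ok / len(patterns)

# Hebbian starting point: symmetric weights, zero self-connections.
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

# (1+1)-ES: mutate all weights with Gaussian noise; keep the mutant
# if its recall fitness is at least as good as the parent's.
sigma = 0.05  # mutation step size (assumed)
best = recall_fraction(W, patterns)
for gen in range(200):
    child = W + sigma * rng.standard_normal(W.shape)
    np.fill_diagonal(child, 0.0)
    f = recall_fraction(child, patterns)
    if f >= best:
        W, best = child, f
```

Because the selection step only ever accepts mutants that match or improve the parent's fitness, the evolved weight matrix recalls the stored patterns at least as well as the plain Hebbian one; richer fitness functions (e.g. rewarding basin size) would be needed to probe the capacity/basin trade-off mentioned above.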
