Comparison between sparsely distributed memory and Hopfield-type neural network models

The Sparse Distributed Memory (SDM) model (Kanerva, 1984) is compared to Hopfield-type neural-network models. A mathematical framework for comparing the two is developed, and the storage capacity of each model is investigated. The capacity of the SDM can be increased independently of the dimension of the stored vectors, whereas the Hopfield capacity is limited to a fraction of that dimension. However, the total number of stored bits per matrix element is the same in the two models, and also in extended models with higher-order interactions. The models are further compared in their ability to store sequences of patterns. The SDM is extended to include time delays so that contextual information can be used to recover sequences. Finally, it is shown how a generalization of the SDM allows storage of correlated input pattern vectors.
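To make the capacity contrast concrete, the following is a minimal sketch (in Python with NumPy) of the two standard constructions the abstract refers to: Hopfield outer-product storage, whose capacity is tied to the pattern dimension n, and Kanerva-style SDM storage, whose capacity grows with the number of hard-address locations m independently of n. The class and parameter names (SDM, radius, and the chosen dimensions) are illustrative assumptions, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bipolar_sign(v):
    # Map to +/-1, breaking ties toward +1 so states stay bipolar.
    return np.where(v >= 0, 1, -1)

# Hopfield-type storage: a single n x n weight matrix built from outer
# products of the stored patterns.  Capacity is limited to a fraction of n.
def hopfield_store(patterns):              # patterns: (p, n) array of +/-1
    p, n = patterns.shape
    W = patterns.T @ patterns / n           # sum of outer products
    np.fill_diagonal(W, 0.0)                # no self-connections
    return W

def hopfield_recall(W, x, steps=10):
    for _ in range(steps):
        x = bipolar_sign(W @ x)             # synchronous updates for brevity
    return x

# Kanerva-style SDM: m hard-address locations of dimension n, each with a
# counter vector.  Capacity grows with m, independently of the stored-vector
# dimension n (illustrative sketch of the standard construction).
class SDM:
    def __init__(self, n, m, radius):
        self.addresses = rng.choice([-1, 1], size=(m, n))
        self.counters = np.zeros((m, n))
        self.radius = radius                # Hamming-distance activation radius

    def _active(self, addr):
        # Locations whose address lies within `radius` of the probe address.
        return (self.addresses != addr).sum(axis=1) <= self.radius

    def write(self, addr, data):
        self.counters[self._active(addr)] += data

    def read(self, addr):
        return bipolar_sign(self.counters[self._active(addr)].sum(axis=0))

# Tiny autoassociative usage example (dimensions chosen only for illustration).
n, m = 64, 1000
patterns = rng.choice([-1, 1], size=(5, n))
W = hopfield_store(patterns)
memory = SDM(n, m, radius=24)
for p in patterns:
    memory.write(p, p)
print(np.array_equal(hopfield_recall(W, patterns[0]), patterns[0]),
      np.array_equal(memory.read(patterns[0]), patterns[0]))
```

In this sketch, enlarging m simply adds rows to the address and counter matrices without touching the stored-vector dimension n, which is the sense in which SDM capacity can be scaled independently of the dimension of the stored vectors.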
