Statistical analysis of the dynamics of a sparse associative memory

Abstract

A theoretical treatment of the dynamics of a recurrent autoassociative network is given. The network consists of randomly connected excitatory neurons, together with an inhibitory interneuron that sets their thresholds. Both the degree of connectivity between the neurons and the level of firing in the stored memories can be set arbitrarily. The memories are stored via a two-valued Hebbian rule, and evolution from an arbitrary initial state proceeds by discrete, synchronous steps. The theory takes into account both spatial correlations between the learned connection strengths and temporal correlations between the state of the system and these connection strengths. Good qualitative and quantitative agreement with computer simulations is obtained for both the intermediate states and the final equilibrium state. Recall is studied as a function of the initial state and of the threshold parameters. The capacity of the network is investigated both numerically and analytically: capacity increases sharply as the level of firing decreases and also as the connectivity increases.
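The network described in the abstract can be illustrated with a short simulation sketch. The code below is not the authors' model or code: the parameter names (N, Z, a, P, theta), the clipped (two-valued) Hebbian storage, and the activity-scaled inhibitory threshold are assumptions chosen to be consistent with the abstract's description of sparse connectivity, low firing levels, and discrete synchronous updates.

```python
import numpy as np

# Minimal sketch of a sparse autoassociative network (illustrative only).
# N excitatory units with random connectivity, sparse binary memories stored
# by a two-valued Hebbian rule, and a global threshold scaled by total
# activity to mimic the inhibitory interneuron. All parameter values are
# assumptions, not values from the paper.

rng = np.random.default_rng(0)

N = 1000       # number of excitatory neurons
Z = 0.1        # connection probability (degree of connectivity)
a = 0.05       # fraction of active units per memory (level of firing)
P = 50         # number of stored memories
theta = 0.8    # threshold parameter (assumed form of inhibitory control)

# Random connectivity mask and sparse binary memories
C = rng.random((N, N)) < Z
memories = (rng.random((P, N)) < a).astype(int)

# Two-valued Hebbian storage: a synapse is 1 if its pre- and post-synaptic
# units were co-active in at least one memory and the connection exists.
W = (memories.T @ memories > 0).astype(int) * C
np.fill_diagonal(W, 0)

def step(x):
    """One discrete, synchronous update with activity-dependent threshold."""
    h = W @ x                                  # excitatory input to each unit
    return (h >= theta * x.sum()).astype(int)  # threshold set by total firing

# Recall from a degraded cue: start from a partial version of memory 0
x = memories[0].copy()
x[rng.random(N) < 0.2] = 0                     # delete 20% of the active units
for _ in range(10):
    x = step(x)

overlap = (x @ memories[0]) / memories[0].sum()
print("overlap with stored memory:", overlap)
```

In this sketch the quantities the abstract treats as free parameters (connectivity Z, firing level a, threshold theta) can be varied to probe recall quality and capacity, in the spirit of the numerical experiments the paper reports.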
