Iterative retrieval of sparsely coded associative memory patterns

Abstract. We investigate the pattern-completion performance of neural auto-associative memories composed of binary threshold neurons for sparsely coded binary memory patterns. By focussing on iterative retrieval, we are able to introduce effective threshold control strategies, which we investigate by means of computer-simulation experiments and analytical treatment. To evaluate the system's performance we consider the completion capacity C and the mean retrieval errors. The asymptotic completion capacity for the recall of sparsely coded binary patterns with one-step retrieval is known to be (ln 2)/4 ≈ 17.32% for binary Hebbian learning and 1/(8 ln 2) ≈ 18% for additive Hebbian learning. These values are achieved with vanishing error probability and are higher than those obtained in other known neural memory models. Recent investigations of binary Hebbian learning have proved that iterative retrieval, although a more refined retrieval method, does not improve the asymptotic completion capacity of one-step retrieval. For a finite-size auto-associative memory, however, we show that iterative retrieval achieves higher capacity and better error correction than one-step retrieval. One-step retrieval produces high retrieval errors at optimal memory load; iterative retrieval reduces these errors within a few iteration steps (t ⩽ 5). Experiments show that in the finite model binary Hebbian learning performs much better than additive Hebbian learning, so the main concern of this paper is binary Hebbian learning. We examine iterative retrieval in experiments with up to n = 20,000 threshold neurons. At this system size, one-step retrieval yields a completion capacity of about 16%, the second retrieval step increases this value to 17.9%, and with iterative retrieval we obtain 19%. The first two retrieval steps in the finite system have also been treated analytically. For one-step retrieval, the asymptotic capacity value is approached from below as the system size grows. In the second retrieval step (and, as the experiments suggest, also for iterative retrieval) the finite-size behaviour is different: the capacity exceeds the asymptotic value, reaches an optimum at a finite system size, and then decreases towards the asymptotic limit.
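
To make the retrieval scheme concrete, here is a minimal Python sketch of a Willshaw-type auto-associative memory with binary (clipped) Hebbian learning and iterative retrieval. The threshold control used here (in each step, take the largest threshold that still leaves at least k units active, assuming the common pattern activity k is known) is only one simple strategy in the spirit of those investigated in the paper, not the paper's exact method; all function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Binary (clipped) Hebbian learning, i.e. the Willshaw rule:
    W[i, j] = 1 iff units i and j are co-active in some stored pattern."""
    n = patterns.shape[1]
    W = np.zeros((n, n), dtype=np.uint8)
    for x in patterns:
        W |= np.outer(x, x)
    return W

def retrieve(W, cue, k, max_steps=5):
    """Iterative retrieval with a simple threshold control: in each step,
    take the largest threshold that keeps at least k units above it and
    feed the thresholded output back as the next cue.  Stops at a fixed
    point or after max_steps (the paper observes convergence for t <= 5)."""
    x = cue.astype(np.uint8)
    for _ in range(max_steps):
        s = W.astype(np.int32) @ x          # dendritic potentials
        theta = np.sort(s)[-k]              # k-th largest potential
        y = (s >= theta).astype(np.uint8)   # at least k units fire
        if np.array_equal(y, x):            # fixed point: stop iterating
            break
        x = y
    return x

# toy usage: n = 1000 units, M = 500 sparse patterns with k = 10 ones each
n, k, M = 1000, 10, 500
patterns = np.zeros((M, n), dtype=np.uint8)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

W = store(patterns)
cue = patterns[0].copy()
cue[np.flatnonzero(cue)[: k // 2]] = 0      # erase half of the pattern
print((retrieve(W, cue, k) == patterns[0]).all())
```

At these toy parameters the matrix load is low (roughly 1 - exp(-M k^2 / n^2) ≈ 5% of synapses set), so a half pattern is typically completed within one or two steps; closer to the optimal memory load, more of the allowed steps are needed and the iteration is what removes the add-errors left by the first step.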
