Artificial memories: Capacity, basis rate and inference

Abstract: We study associative and storage memories for memory traces of size N, and aim to establish that the size of the system (as measured by, e.g., the number of nodes in a network) can be of order N while the number of traces consistent with reliable operation is exponentially large in N, so that a positive capacity (in bits per node) can be achieved. It is well known that, if the traces are generated as M random vectors, then reliability imposes a linear bound on M: it implies an upper bound on the asymptotic (large-N) value of α = M/N. For the noise-free Hopfield net this critical bound is about 0.138. We show that, if superposition of traces is allowed, so that the M given traces constitute the random basis of a linear code, then exponential memory size and a positive capacity can be achieved. However, there is still a critical upper bound on the basis rate α = M/N, implied now not by the condition of reliability, but by the requirement that the recursion realising the calculation be stable. For our model we determine this critical value exactly as α_c = 3 − √8 ≈ 0.172. Our model is based upon inference concepts and differs in slight but important respects from the Hopfield model. We do not use replica methods, but appeal to a generalised version of the Wigner semi-circle theorem on the asymptotic distribution of eigenvalues.
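The two numerical claims in the abstract can be sanity-checked directly. The sketch below (assumptions: NumPy, and a standard symmetric random matrix as a stand-in for the paper's ensemble, since the abstract does not specify the model's matrices) evaluates the critical basis rate α_c = 3 − √8 and empirically illustrates the Wigner semi-circle phenomenon the authors appeal to: the eigenvalues of a large random symmetric matrix with i.i.d. entries of variance 1/N concentrate on the interval [−2, 2].

```python
import numpy as np

# Critical basis rate quoted in the abstract: alpha_c = 3 - sqrt(8)
alpha_c = 3 - np.sqrt(8)
print(f"alpha_c = {alpha_c:.4f}")  # ~0.1716, matching the quoted 0.172

# Illustrative check of the Wigner semi-circle law (not the paper's
# generalised version): symmetrise an i.i.d. Gaussian matrix so that
# entries have variance 1/N, then inspect the eigenvalue spectrum.
rng = np.random.default_rng(0)
N = 2000
A = rng.standard_normal((N, N)) / np.sqrt(N)
S = (A + A.T) / np.sqrt(2)  # symmetric, off-diagonal variance 1/N
eigs = np.linalg.eigvalsh(S)
print(f"spectrum range: [{eigs.min():.2f}, {eigs.max():.2f}]")  # close to [-2, 2]
```

Note that 0.172 ≈ 3 − √8 exceeds the Hopfield bound 0.138 cited for the random-vector case, which is the comparison the abstract draws.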
