Associative memories with short-range, higher order couplings

A study of recurrent associative memories with exclusively short-range connections is presented. To increase capacity, higher order couplings are used. We study the capacity and pattern completion ability of networks consisting of units with binary (±1) output. Results show that perfect learning of random patterns is difficult for very short coupling ranges, and that the average expected capacities (allowing small errors) in these cases are much smaller than the theoretical maximum of 2 bits per coupling. However, it is also shown that by choosing ranges longer than certain limit sizes, which depend on network size and order, we can come close to the theoretical capacity limit. We indicate that these limit sizes increase very slowly with network size. Thus, couplings to at least 28 and 36 neighbors suffice for second order networks with 400 and 90,000 units, respectively. Simulations show that even networks with coupling ranges below the limit size are able to complete input patterns with more than 10% errors. Especially remarkable is the ability to correct inputs with large local errors (part of the pattern is masked). We present a local learning algorithm for heteroassociation in recurrent networks without hidden units. The algorithm is used in a multinet system to improve pattern completion ability on correlated patterns.
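To make the setting concrete, the following is a minimal sketch of a second order associative memory with exclusively short-range couplings: units sit on a ring, each unit couples only to neighbors within a radius R, and recall uses updates of the form s_i ← sign(Σ_jk T_ijk s_j s_k). The Hebbian-style storage rule, the ring topology, and all function names here are illustrative assumptions for exposition, not the paper's perceptron-type learning algorithm.

```python
import numpy as np

def build_couplings(patterns, R):
    """Build second-order couplings T[i, a, b] between unit i and its
    neighbors within radius R on a ring of N units.
    NOTE: a simple Hebbian-style rule is assumed here for illustration;
    the paper studies more powerful (perceptron-type) learning."""
    P, N = patterns.shape
    # neighbor indices of each unit i (excluding i itself)
    nbrs = np.array([[(i + d) % N for d in range(-R, R + 1) if d != 0]
                     for i in range(N)])
    T = np.zeros((N, 2 * R, 2 * R))
    for i in range(N):
        for a, j in enumerate(nbrs[i]):
            for b, k in enumerate(nbrs[i]):
                if j != k:
                    # third-order Hebb term xi_i * xi_j * xi_k, summed over patterns
                    T[i, a, b] = np.dot(patterns[:, i],
                                        patterns[:, j] * patterns[:, k])
    return T, nbrs

def recall(T, nbrs, s, max_steps=20):
    """Synchronous retrieval: s_i <- sign(sum_{j,k} T_ijk s_j s_k),
    iterated until a fixed point is reached."""
    s = s.copy()
    for _ in range(max_steps):
        v = s[nbrs]                                # (N, 2R) neighbor states
        h = np.einsum('na,nab,nb->n', v, T, v)     # local fields
        new = np.where(h >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s
```

Note one feature of even-degree (second order) couplings: the local field depends on neighbor states only through products s_j s_k, so a globally inverted pattern produces the same (correct) fields and is mapped back onto the stored pattern in a single step, consistent with the correction of large local errors described in the abstract.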
