A learning and forgetting algorithm in associative memories: results involving pseudo-inverses

The authors develop a design technique for associative memories with learning and forgetting capabilities, realized as artificial feedback neural networks. The method draws on the theory of large-scale interconnected dynamical systems rather than the usual energy-function methods. Networks synthesized by this technique can learn new patterns and forget old ones without recomputing the entire interconnection matrix. The method, which uses properties of pseudo-inverse matrices to iteratively solve systems of linear equations, offers significant improvements over the outer-product method and the projection learning rule. Several specific examples illustrate the strengths and weaknesses of the methodology.
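To make the mechanism concrete, below is a minimal numpy sketch of incremental learning and forgetting under the projection (pseudo-inverse) rule, using Greville-style rank-one updates so that neither operation recomputes the full interconnection matrix. This is an illustration of the general technique, not the authors' exact synthesis procedure: the class `PseudoInverseMemory` and its methods `learn`, `forget`, and `recall` are our own hypothetical names, and retrieval is shown as the standard synchronous sgn(Wv) iteration.

```python
# Sketch: projection-rule associative memory with Greville-style updates.
# Assumptions (not from the paper): class/method names, tolerance handling,
# and the synchronous sign-iteration used for recall.
import numpy as np


class PseudoInverseMemory:
    def __init__(self, n, tol=1e-10):
        self.n = n                      # pattern / neuron dimension
        self.tol = tol
        self.X = np.zeros((n, 0))       # stored patterns, one per column
        self.G = np.zeros((0, n))       # G = pinv(X), maintained recursively
        self.W = np.zeros((n, n))       # interconnection matrix W = X @ pinv(X)

    def learn(self, x):
        """Add a pattern via Greville's recursion (rank-one update, no full pinv)."""
        x = np.asarray(x, dtype=float)
        d = self.G @ x                  # coordinates of x in the stored subspace
        r = x - self.X @ d              # component of x orthogonal to that subspace
        rr = r @ r
        if rr < self.tol:               # x already lies in span(X): nothing to add
            return False
        b = r / rr                      # new bottom row of the pseudo-inverse
        self.G = np.vstack([self.G - np.outer(d, b), b])
        self.X = np.hstack([self.X, x[:, None]])
        self.W += np.outer(r, r) / rr   # rank-one update of the projection
        return True

    def forget(self, j):
        """Remove stored pattern j by deflating W and downdating pinv(X)."""
        g = self.G[j]                   # j-th row of pinv(X); satisfies g @ x_i = delta_ij
        gg = g @ g
        self.W -= np.outer(g, g) / gg   # remove the direction unique to pattern j
        coeffs = self.G @ g             # re-orthogonalize the remaining pinv rows
        self.G = np.delete(self.G - np.outer(coeffs, g) / gg, j, axis=0)
        self.X = np.delete(self.X, j, axis=1)

    def recall(self, probe, steps=20):
        """Synchronous retrieval: iterate sgn(W v) from a (possibly noisy) probe."""
        v = np.sign(np.asarray(probe, dtype=float))
        for _ in range(steps):
            v_next = np.sign(self.W @ v)
            v_next[v_next == 0] = v[v_next == 0]   # keep current state on ties
            if np.array_equal(v_next, v):
                break
            v = v_next
        return v
```

For linearly independent bipolar patterns, W is the orthogonal projector onto their span, so every stored pattern is a fixed point of `recall`; both `learn` and `forget` cost a rank-one correction to W and to pinv(X), rather than the batch recomputation a from-scratch pseudo-inverse would require, which mirrors the incremental learning/forgetting goal stated in the abstract.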
