Memory Dynamics in Attractor Networks

Attractor networks, which can be represented by neurons and their synaptic connections, are widely believed to underlie biological memory systems and have been used extensively in recent years to model the storage and retrieval of memories. In this paper, we propose a new energy function that is nonnegative and attains the value zero only at the desired memory patterns. An attractor network is designed based on the proposed energy function, and it is shown that the desired memory patterns are stored as stable equilibrium points of the network. To retrieve a memory pattern, an initial stimulus input is presented to the network, and its state converges to one of the stable equilibrium points. Consequently, spurious points, that is, local maxima, saddle points, or other local minima corresponding to undesired memory patterns, are avoided. Simulation results demonstrate the effectiveness of the proposed method.
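The abstract does not specify the form of the proposed energy function, so the following is only a minimal illustrative sketch of the general idea: a nonnegative energy that vanishes exactly at the stored patterns, with retrieval performed by gradient-descent dynamics from a noisy stimulus. The product-of-squared-distances energy and the helper names (`energy`, `energy_grad`, `retrieve`) are assumptions made for illustration, not the authors' construction.

```python
# Hypothetical sketch (not the paper's energy function): use
#     E(x) = prod_k (1/2) * ||x - xi_k||^2,
# which is nonnegative and zero exactly at the stored patterns xi_k,
# and retrieve a memory by gradient descent on E from a noisy cue.
import numpy as np

rng = np.random.default_rng(0)

# Store two bipolar memory patterns (rows of `patterns`).
patterns = np.array([[1.0, -1.0, 1.0, -1.0],
                     [-1.0, 1.0, 1.0, 1.0]])

def energy(x):
    """Nonnegative energy that vanishes only at the stored patterns."""
    dists = 0.5 * np.sum((patterns - x) ** 2, axis=1)
    return np.prod(dists)

def energy_grad(x):
    """Gradient of the product-of-distances energy."""
    diffs = x - patterns                       # shape (K, n)
    dists = 0.5 * np.sum(diffs ** 2, axis=1)   # shape (K,)
    grad = np.zeros_like(x)
    for k in range(len(patterns)):
        grad += np.prod(np.delete(dists, k)) * diffs[k]
    return grad

def retrieve(stimulus, lr=0.05, steps=2000):
    """Run the dynamics x <- x - lr * dE/dx starting from the stimulus."""
    x = stimulus.copy()
    for _ in range(steps):
        x -= lr * energy_grad(x)
    return x

# Present a noisy version of the first pattern and let the state converge.
cue = patterns[0] + 0.3 * rng.standard_normal(patterns.shape[1])
recovered = retrieve(cue)
print("energy at convergence:", energy(recovered))
print("recovered state      :", np.round(recovered, 2))
```

In this toy setting the state initialized near a stored pattern descends the energy toward that pattern; whether spurious fixed points are ruled out depends on the specific energy function, which is the contribution claimed in the paper itself.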
