Computing with Continuous Attractors: Stability and Online Aspects

Two issues concerning the application of continuous attractors in neural systems are investigated: the computational robustness of continuous attractors to input noise, and the implementation of Bayesian online decoding. In an idealized mathematical model of a continuous attractor, the decoded stimulus value is highly sensitive to input noise, and this sensitivity is an inevitable consequence of the system's neutral stability. To overcome this shortcoming, we modify the conventional network model by including extra dynamical interactions between neurons. These interactions evolve according to a biologically plausible Hebbian learning rule and play the computational role of memorizing and propagating stimulus information accumulated over time. As a result, the new network model responds to the history of external inputs over a period of time, and hence becomes insensitive to short-term fluctuations. Moreover, since the dynamical interactions provide a mechanism for conveying prior knowledge of the stimulus, that is, the information carried by previously presented stimuli, the network effectively implements online Bayesian inference. This study also reveals some interesting behaviors of neural population coding, such as the trade-off between decoding stability and the speed of tracking time-varying stimuli, and the relationship between neural tuning width and tracking speed.
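The class of model the abstract refers to can be illustrated with a minimal simulation. The sketch below is a generic one-dimensional continuous-attractor network on a ring, with translation-invariant Gaussian recurrent weights and divisive global inhibition, driven by a noisy stimulus-centered input and decoded by the population vector. All parameter values and the specific nonlinearity are illustrative assumptions, not the paper's own model; in particular, it omits the paper's Hebbian dynamical interactions and shows only the baseline network whose neutral stability the paper sets out to repair.

```python
import numpy as np

# Minimal sketch of a 1-D continuous-attractor network (CANN) on a ring.
# Assumed ingredients (standard in CANN models, not taken from the paper):
# Gaussian recurrent coupling, divisive global inhibition, Euler integration.

N = 128                                              # number of neurons
x = np.linspace(-np.pi, np.pi, N, endpoint=False)    # preferred stimuli

def gauss(d, a):
    """Gaussian of width a on the ring (distance wrapped to [-pi, pi))."""
    d = (d + np.pi) % (2 * np.pi) - np.pi
    return np.exp(-d ** 2 / (2 * a ** 2))

a = 0.5                                              # tuning width
W = gauss(x[:, None] - x[None, :], a)                # translation-invariant coupling
W /= W.sum(axis=1, keepdims=True)                    # normalize row sums

def simulate(stimulus, steps=400, noise=0.1, dt=0.1, seed=0):
    """Integrate du/dt = -u + 10 W r + I_ext and decode the bump position."""
    rng = np.random.default_rng(seed)
    u = np.zeros(N)
    for _ in range(steps):
        I_ext = gauss(x - stimulus, a) + noise * rng.standard_normal(N)
        r = np.maximum(u, 0.0) ** 2                  # squared rectification
        r /= 1.0 + 0.5 * r.sum() / N                 # divisive global inhibition
        u += dt * (-u + 10.0 * W @ r + I_ext)
    # population-vector decoding of the activity bump
    return np.angle(np.sum(np.maximum(u, 0.0) * np.exp(1j * x)))

theta = simulate(0.8)                                # decoded position, close to 0.8
```

With a persistent external input the bump is pinned near the true stimulus; once the input is removed, the bump persists but sits on a line of neutrally stable states, so residual noise makes it drift freely. This is the sensitivity the abstract attributes to neutral stability and addresses with slow Hebbian interactions that let the network average stimulus information over time.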
