Cusps enable line attractors for neural computation.

Line attractors in neuronal networks have been suggested as the basis of many brain functions, such as working memory, oculomotor control, head movement, locomotion, and sensory processing. In this paper, we make the connection between line attractors and pulse gating in feedforward neuronal networks. In this context, because of their neutral stability along a one-dimensional manifold, line attractors are associated with a time-translational invariance that allows graded information to be propagated from one neuronal population to the next. To understand how pulse gating manifests itself in a high-dimensional, nonlinear, feedforward integrate-and-fire network, we use a Fokker-Planck approach to analyze the system dynamics. We make a connection between pulse-gated propagation in the Fokker-Planck and population-averaged mean-field (firing-rate) models, and then identify an approximate line attractor in state space as the essential structure underlying graded information propagation. An analysis of the line attractor shows that it consists of three fixed points: a central saddle, with an unstable manifold along the line and stable manifolds orthogonal to it, flanked on either side by stable fixed points. Along the manifold defined by these fixed points, slow dynamics give rise to a ghost. We show that this line attractor arises at a cusp catastrophe, where a fold bifurcation develops as a function of synaptic noise, and that the ghost dynamics near the fold of the cusp underlie the robustness of the line attractor. Understanding the dynamical aspects of this cusp catastrophe allows us to show how line attractors can persist in biologically realistic neuronal networks and how the interplay of pulse gating, synaptic coupling, and neuronal stochasticity can be used to enable attracting one-dimensional manifolds and, thus, to dynamically control the processing of graded information.
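As a toy illustration of the fixed-point structure described above (and not the paper's Fokker-Planck or firing-rate model), the sketch below uses the cusp normal form dx/dt = h + r*x - x^3. For r > 0 and |h| below the fold value, it has a central unstable fixed point flanked by two stable ones; as h crosses the fold, two of the fixed points annihilate and trajectories linger near the "ghost" of the vanished pair, which is the slowing-down behavior the abstract invokes. The parameter names, values, and the mapping of (r, h) onto synaptic coupling and noise are illustrative assumptions, not quantities taken from the paper.

```python
# Toy sketch of the cusp normal form dx/dt = h + r*x - x**3 (illustrative only;
# this is NOT the paper's Fokker-Planck or mean-field network model).
import numpy as np

def rhs(x, r, h):
    """Right-hand side of the cusp normal form."""
    return h + r * x - x**3

def fixed_points(r, h):
    """Real fixed points, i.e. real roots of -x**3 + r*x + h = 0."""
    roots = np.roots([-1.0, 0.0, r, h])
    return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

def fold_h(r):
    """Fold location for r > 0: two fixed points merge at |h| = 2*(r/3)**1.5."""
    return 2.0 * (r / 3.0) ** 1.5

def passage_time(r, h, x0=-1.5, x_end=1.0, dt=1e-3, t_max=1e4):
    """Euler-integrate from x0 until x exceeds x_end; just past the fold the
    trajectory lingers near the ghost at x = -sqrt(r/3), so this time grows
    without bound as h -> fold_h(r) from above."""
    x, t = x0, 0.0
    while x < x_end and t < t_max:
        x += dt * rhs(x, r, h)
        t += dt
    return t

if __name__ == "__main__":
    r = 1.0
    hc = fold_h(r)  # ~0.385 for r = 1
    print("below the fold (three fixed points: stable, unstable, stable):")
    print("  h = 0.0 ->", fixed_points(r, 0.0))
    print("just past the fold (single stable fixed point plus a ghost):")
    for eps in (1e-2, 1e-3, 1e-4):
        h = hc + eps
        print(f"  eps = {eps:.0e}: fixed points = {fixed_points(r, h)}, "
              f"passage time ~ {passage_time(r, h):.1f}")
```

Running the script prints the three-fixed-point configuration at h = 0 and shows the passage time lengthening as h approaches the fold from above, a one-dimensional caricature of the slow ghost dynamics that the paper argues make the line attractor robust.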
