Linking connectivity, dynamics and computations in recurrent neural networks

Large-scale recordings of neural activity in behaving animals have established that the transformation of sensory stimuli into motor outputs relies on low-dimensional dynamics at the population level, while individual neurons generally exhibit complex, mixed selectivity. Understanding how low-dimensional computations on mixed, distributed representations emerge from the structure of recurrent connectivity and inputs to cortical networks is a major challenge. Classical models of recurrent networks fall into two extremes: on one hand, balanced networks are based on fully random connectivity and generate high-dimensional spontaneous activity; on the other hand, strongly structured, clustered networks lead to low-dimensional dynamics and ad hoc computations, but rely on pure selectivity. A number of functional approaches for training recurrent networks, however, suggest that a specific type of minimal connectivity structure is sufficient to implement a large range of computations. Starting from this observation, here we study a new class of recurrent network models in which the connectivity consists of a combination of a random part and a minimal, low-dimensional structure. We show that in such low-rank recurrent networks, the dynamics are low-dimensional and can be directly inferred from the connectivity using a geometrical approach. We exploit this understanding to determine the minimal connectivity structures required to implement specific computations. We find that the dynamical range and computational capacity of a network increase quickly with the dimensionality of the structure in the connectivity, such that a rank-two structure is already sufficient to implement a complex behavioral task such as context-dependent decision-making.
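The connectivity described above can be illustrated with a minimal sketch, not the authors' actual code: a rank-one structure m nᵀ/N is added to a random matrix, and standard rate dynamics are integrated. All parameters here (network size N, random-part gain g, and the choice n ∝ m with overlap above one, which creates a nontrivial fixed point along m) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a low-rank recurrent network (illustrative parameters):
# connectivity = random part + rank-one structure m n^T / N.
rng = np.random.default_rng(0)

N = 500                                            # number of units
g = 0.5                                            # gain of the random part (below chaos onset)
chi = rng.standard_normal((N, N)) / np.sqrt(N)     # random connectivity
m = rng.standard_normal(N)                         # left structure vector
n = 2.0 * m                                        # right structure vector; overlap n.m/N ~ 2 > 1
J = g * chi + np.outer(m, n) / N                   # low-rank recurrent connectivity

# Rate dynamics dx/dt = -x + J tanh(x), Euler integration from a small random state.
dt, steps = 0.1, 500
x = 0.1 * rng.standard_normal(N)
trajectory = np.empty((steps, N))
for t in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))
    trajectory[t] = x

# The population trajectory is low-dimensional: the leading principal component
# captures most of the variance, and the final state aligns with m.
X = trajectory - trajectory.mean(axis=0)
svals = np.linalg.svd(X, compute_uv=False)
var_explained = svals[0] ** 2 / np.sum(svals ** 2)
alignment = abs(np.corrcoef(x, m)[0, 1])
```

In this sketch the network settles into a fixed point whose component along m dominates, so the activity effectively evolves along the structured direction even though the random part makes individual units look heterogeneous, mirroring the mixed selectivity described in the abstract.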
