Universality and individuality in neural dynamics across large populations of recurrent networks

Task-based modeling with recurrent neural networks (RNNs) has emerged as a popular way to infer the computational function of different brain regions. These models are quantitatively assessed by comparing the model's low-dimensional neural representations with those measured in the brain, for example using canonical correlation analysis (CCA). However, the nature of the detailed neurobiological inferences one can draw from such efforts remains elusive. For example, to what extent does training neural networks to solve common tasks uniquely determine the network dynamics, independent of architectural choices? Or, alternatively, are the learned dynamics highly sensitive to such choices? The answers to these questions have strong implications for whether and how we should use task-based RNN modeling to understand brain dynamics. To address these foundational questions, we study populations of thousands of networks, spanning commonly used RNN architectures, trained to solve neuroscientifically motivated tasks, and we characterize their nonlinear dynamics. We find that the geometry of the RNN representations can be highly sensitive to the choice of architecture, yielding a cautionary tale for measures of similarity that rely on representational geometry, such as CCA. Moreover, we find that while the geometry of neural dynamics can vary greatly across architectures, the underlying computational scaffold (the topological structure of fixed points, the transitions between them, limit cycles, and linearized dynamics) often appears universal across all architectures.
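As a concrete illustration of the geometry comparison described above, here is a minimal sketch of a CCA-based similarity score between the hidden-state trajectories of two networks. The data shapes, function name, and use of scikit-learn are assumptions for illustration, not the paper's exact analysis pipeline.

```python
# Minimal sketch (assumption: not the authors' exact pipeline) of comparing
# two RNNs' hidden-state geometry with canonical correlation analysis (CCA).
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_similarity(states_a, states_b, n_components=10):
    """Mean canonical correlation between two (time*trials, units) state matrices."""
    cca = CCA(n_components=n_components, max_iter=1000)
    za, zb = cca.fit_transform(states_a, states_b)
    # Correlation of each pair of canonical variates, averaged into one score.
    corrs = [np.corrcoef(za[:, i], zb[:, i])[0, 1] for i in range(n_components)]
    return float(np.mean(corrs))

# Hypothetical usage: hidden states from two architectures trained on the same task.
rng = np.random.default_rng(0)
states_lstm = rng.standard_normal((500, 128))   # e.g. LSTM hidden states
states_gru = rng.standard_normal((500, 64))     # e.g. GRU hidden states
print(cca_similarity(states_lstm, states_gru))
```

The "computational scaffold" view, by contrast, rests on locating fixed points of the trained dynamics and examining the linearized dynamics around them. The sketch below shows this idea for a hypothetical vanilla tanh RNN with zero input; it is a toy stand-in for the fixed-point analysis, not the full toolbox used in the paper.

```python
# Minimal sketch (assumption: a vanilla tanh RNN, zero input) of locating a
# fixed point h* with F(h*) ~= h* and linearizing the dynamics around it.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 32
W = rng.standard_normal((n, n)) / np.sqrt(n)   # recurrent weights
b = np.zeros(n)

def step(h):
    """One autonomous RNN update, F(h) = tanh(W h + b)."""
    return np.tanh(W @ h + b)

def speed(h):
    """q(h) = 0.5 * ||F(h) - h||^2, which vanishes at fixed points."""
    d = step(h) - h
    return 0.5 * d @ d

# Minimize the speed from a random initial state to find a candidate fixed point.
res = minimize(speed, rng.standard_normal(n) * 0.1, method="L-BFGS-B")
h_star = res.x

# Linearized dynamics: Jacobian of F at h*; its eigenvalues govern local stability.
J = (1.0 - np.tanh(W @ h_star + b) ** 2)[:, None] * W
eigvals = np.linalg.eigvals(J)
print("speed at candidate fixed point:", speed(h_star))
print("max |eigenvalue| of Jacobian:", np.abs(eigvals).max())
```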
