Probing the structure–function relationship with neural networks constructed by solving a system of linear equations

Neural network models are an invaluable tool for understanding brain function, since they connect the cellular and circuit levels with behaviour. These networks usually comprise a huge number of parameters, which must be chosen carefully so that the networks reproduce anatomical, behavioural and neurophysiological data. The parameters are usually fitted with off-the-shelf optimization algorithms that iteratively change them and simulate the network to evaluate the changes and improve the fit. Here we propose to invert the fitting process by proceeding from the network dynamics towards the network parameters. Firing-state transitions are chosen according to the transition graph followed by an agent solving a given behavioural task. A system of linear equations is then constructed from the network firing states and membrane potentials, in such a way that consistency of the system is guaranteed. This decouples the activity features of the model, such as neuronal firing rates and correlations, from the connectivity features and from the task-solving algorithm implemented by the network, allowing these three levels to be fitted separately. We employed the method to probe the structure–function relationship in a stimulus-sequence memory task, finding solution networks where commonly employed optimization algorithms failed. The constructed networks showed reciprocity and correlated firing patterns that recapitulate experimental observations. We argue that the proposed method is a complementary and much-needed alternative to the way neural networks are currently constructed to model brain function.
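The sketch below illustrates the general idea of working from dynamics towards parameters, assuming a discrete-time threshold-unit network; the variable names, the threshold/margin parameters, and the least-squares solver are illustrative assumptions, not the authors' exact construction (which guarantees consistency of the linear system rather than relying on an approximate fit).

```python
import numpy as np

# Hypothetical illustration: given a desired sequence of binary firing
# states S[:, t] (one column per time step), find a weight matrix W such
# that the membrane potential W @ S[:, t] exceeds the threshold exactly
# for the neurons that should fire at t+1. Target potentials are placed
# at threshold +/- a margin, and W is obtained by least squares; this is
# only a sketch of the "dynamics -> parameters" direction, not the
# paper's consistency-guaranteeing construction.

rng = np.random.default_rng(0)
n_neurons, n_steps = 20, 15
theta, margin = 0.0, 1.0                     # firing threshold and potential margin

# Desired firing-state transitions (an illustrative random sequence).
S = rng.integers(0, 2, size=(n_neurons, n_steps)).astype(float)

X = S[:, :-1]                                # firing states at time t
U = np.where(S[:, 1:] > 0,                   # target membrane potentials at t+1
             theta + margin, theta - margin)

# Solve W @ X = U for W in the least-squares sense (minimum-norm solution).
W_T, *_ = np.linalg.lstsq(X.T, U.T, rcond=None)
W = W_T.T

# Verify that the constructed network reproduces the prescribed transitions.
reproduced = (W @ X > theta).astype(float)
print("transitions reproduced:", np.array_equal(reproduced, S[:, 1:]))
```

Because the number of time steps is smaller than the number of neurons in this toy example, the linear system is typically exactly solvable and the prescribed transitions are reproduced; constraints such as Dale's principle or a target connection density would have to be imposed on top of this basic construction.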
