A large deviation principle for networks of rate neurons with correlated synaptic weights

One of the major goals of mathematical neuroscience is to rigorously justify macroscopic continuum neural-field equations by deriving them from the microscopic equations governing the interactions of individual neurons. Since the macroscopic variables are continuous, their value at a particular point is usually interpreted as a mean taken over the neurons in a small neighborhood of that point. For this mean to be an accurate approximation, it is normally assumed, or proved, that the neurons are approximately uncorrelated, so that the law of large numbers guarantees that their average behavior is close to the mean.

We develop a model of neural networks with inhomogeneous weights between the neurons and analyze its behavior as the number of neurons tends to infinity. The inhomogeneity of the weights ensures that the neurons in the limit system are not uncorrelated; our results thus suggest that the mean-field approximation is insufficient. Specifically, we study the asymptotic behavior of a network of N firing-rate neurons as N grows to infinity. The neurons are modeled as lying equally spaced on a ring, and the membrane potential of each neuron evolves according to a discrete-time version of the Hopfield or Wilson-Cowan equations [1]. The synaptic weight J(i,j) from presynaptic neuron j to postsynaptic neuron i is modeled as a Gaussian random variable. The weights have identical means that scale as 1/N, and the covariance between J(i,j) and J(k,l) is (1/N) C(i-k, j-l) for some fixed function C. In other words, the covariance is a function of the 'ring distances' between the postsynaptic neurons and between the presynaptic neurons; a sketch of this setup is given below.
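To fix notation, the following display is a minimal sketch of the kind of dynamics and weight statistics just described; the specific firing-rate function $f$, leak parameter $\gamma$, and noise term $B^i_t$ are illustrative assumptions rather than the paper's exact choices, while the $1/N$ scalings of the weight statistics are as stated above:

$$V^i_{t+1} = \gamma\, V^i_t + \sum_{j=1}^{N} J(i,j)\, f\!\left(V^j_t\right) + B^i_t, \qquad \mathbb{E}\!\left[J(i,j)\right] = \frac{\bar{J}}{N}, \qquad \operatorname{Cov}\!\left(J(i,j),\, J(k,l)\right) = \frac{1}{N}\, C(i-k,\, j-l),$$

where $V^i_t$ is the membrane potential of neuron $i$ at time $t$, all neuron indices are taken modulo $N$ (the ring geometry), $f$ is a sigmoidal firing-rate function, $\gamma \in [0,1)$ is the leak, and the $B^i_t$ are independent noise terms.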
Our main result is that the behavior of the infinite-size ensemble of neurons can be described by a simple nonlinear transformation of a Gaussian random process that is spatially stationary along the ring; the nonlinearity is a combination of the firing-rate function and the leak. This Gaussian process is characterized by its mean, the same time-varying function for each neuron, and by its covariance operator, which describes the correlations between any k-tuple of neurons. The covariance operator is also stationary, in the sense that translating each neuron of a k-tuple by the same amount along the ring leaves the correlations unchanged. We obtain explicit equations for the mean and covariance of the limit Gaussian process; they form a set of strongly coupled equations that are recursive in time.

Our analysis goes beyond identifying the asymptotic limit of the network. We also prove that the probability law describing the solutions of the network equations converges exponentially fast toward this limit (in a precise mathematical sense), and we compute the specific rate of convergence using the theory of large deviations [2]. This rate is given explicitly by a function, the good rate function, defined over the set of all possible asymptotic probability laws, and we prove that it has a unique minimum at the asymptotic limit.

Most models of neural networks assume, or prove, some sort of thermodynamic limit, whereby if one isolates a particular population of neurons in a localized area of space, its members are found to fire increasingly asynchronously as the population size tends to infinity, e.g. [3]. Our limit does not possess this property: the nontrivial covariances between the weights ensure that there are large system-wide correlations between the neurons in the asymptotic limit, as the simulation sketch below illustrates. An important implication of our result is that the mean-field behavior is insufficient to characterize the behavior of a population. Our work thus challenges the assumption, held by some, that one cannot have a concise macroscopic description of a neural network without asynchronicity at the local population level. It generalizes the work of Moynot and Samuelides [4].
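The following Python/NumPy sketch simulates the finite-N network under the same illustrative assumptions (tanh firing rate, a separable exponentially decaying kernel C, and arbitrary values for the leak, weight mean, and noise level); only the 1/N scalings of the weight mean and covariance are taken from the text. It samples the correlated Gaussian weights exactly, runs the dynamics, and prints the average off-diagonal correlation between neurons, which for correlated weights is expected to stay bounded away from zero rather than vanish with N.

```python
# Minimal simulation sketch of the finite-N network described above.
# The firing-rate function, leak, noise level, and covariance kernel C
# below are illustrative assumptions, not the paper's exact choices.
import numpy as np

rng = np.random.default_rng(0)

N = 20         # number of neurons on the ring
T = 50         # number of time steps
gamma = 0.5    # leak parameter (assumed)
J_bar = 1.0    # common weight mean: E[J(i,j)] = J_bar / N
sigma_B = 0.1  # noise standard deviation (assumed)

def f(v):
    """Sigmoidal firing-rate function (illustrative choice)."""
    return np.tanh(v)

def ring_dist(d):
    """Distance between neuron indices on the ring of N sites."""
    return np.minimum(d % N, (-d) % N)

def C(di, dj):
    """Covariance kernel: Cov(J(i,j), J(k,l)) = C(i-k, j-l) / N.
    A separable, exponentially decaying kernel (illustrative choice)."""
    return 0.5 * np.exp(-0.5 * (ring_dist(di) + ring_dist(dj)))

# Flattened (i, j) index pairs for the N^2 weight entries.
i_idx, j_idx = np.divmod(np.arange(N * N), N)
Di = i_idx[:, None] - i_idx[None, :]   # i - k for every pair of weight entries
Dj = j_idx[:, None] - j_idx[None, :]   # j - l for every pair of weight entries

# N^2 x N^2 covariance of the weights (tiny jitter for numerical stability),
# then one exact sample of the correlated Gaussian weight matrix J.
Sigma = C(Di, Dj) / N + 1e-12 * np.eye(N * N)
J = rng.multivariate_normal(np.full(N * N, J_bar / N), Sigma).reshape(N, N)

# Discrete-time rate dynamics: V_{t+1} = gamma * V_t + J f(V_t) + noise.
V = np.zeros((T + 1, N))
V[0] = rng.standard_normal(N)
for t in range(T):
    V[t + 1] = gamma * V[t] + J @ f(V[t]) + sigma_B * rng.standard_normal(N)

# With correlated weights, cross-neuron correlations are not expected to
# vanish as N grows, unlike in uncorrelated mean-field models.
corr = np.corrcoef(V[T // 2:].T)
off_diag = np.abs(corr - np.diag(np.diag(corr)))
print("mean |off-diagonal correlation|:", off_diag.sum() / (N * (N - 1)))
```

For larger N, the exact N^2 x N^2 sampling above becomes expensive; since the kernel is stationary on the ring, a circulant-embedding (FFT-based) sampler would be the natural replacement, but the dense version keeps the sketch short and transparent.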

[1] Eugene M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, 2006.

[2] Sompolinsky et al., Dynamics of spin systems with randomly asymmetric bonds: Langevin dynamics and a spherical model, Physical Review A, 1987.

[3] Aarnout Brombacher et al., Probability..., Qual. Reliab. Eng. Int., 2009.

[4] M. Alexander et al., Principles of Neural Science, 1981.

[5] Paul C. Bressloff et al., Stochastic Neural Field Theory and the System-Size Expansion, SIAM J. Appl. Math., 2009.

[6] Wulfram Gerstner et al., Spiking Neuron Models, 2002.

[7] W. Gerstner et al., Time structure of the activity in neural network models, Physical Review E, 1995.

[8] J. Baxter et al., An Approximation Condition for Large Deviations and Some Applications, 1996.

[9] Bruno Cessac et al., From neuron to neural networks dynamics, arXiv, 2006.

[10] G. B. Arous et al., Symmetric Langevin spin glass dynamics, 1997.

[11] David Terman, Mathematical foundations of neuroscience, 2010.

[12] Sompolinsky et al., Theory of correlations in stochastic neural networks, Physical Review E, 1994.

[13] M. Samuelides et al., Large deviations and mean-field theory for asymmetric random recurrent neural networks, 2002.

[14] Pierre Del Moral et al., Large Deviations for Interacting Processes in the Strong Topology, 2005.

[15] G. B. Arous et al., Large deviations for Langevin spin glass dynamics, 1995.

[16] Daryl J. Daley et al., Introduction to the General Theory of Point Processes, 1998.

[17] H. Sompolinsky, Relaxational dynamics of the Edwards-Anderson model and the mean-field theory of spin-glasses, 1982.

[18] Alice Guionnet, Dynamique de Langevin d'un verre de spins, 1995.

[19] J. Touboul et al., Mean-field description and propagation of chaos in networks of Hodgkin-Huxley and FitzHugh-Nagumo neurons, The Journal of Mathematical Neuroscience, 2012.

[20] María J. Cáceres et al., Analysis of nonlinear noisy integrate & fire neuron models: blow-up and steady states, Journal of Mathematical Neuroscience, 2010.

[21] Michael A. Buice et al., Systematic Fluctuation Expansion for Neural Network Activity Equations, Neural Computation, 2009.

[22] Olivier Moynot, Étude mathématique de la dynamique des réseaux neuronaux aléatoires récurrents, 2000.

[23] Sommers et al., Chaos in random neural networks, Physical Review Letters, 1988.

[24] S. Varadhan et al., Large deviations, Graduate Studies in Mathematics, 2019.

[25] S. Varadhan et al., Large deviations for stationary Gaussian processes, 1985.

[26] J. Cowan et al., Field-theoretic approach to fluctuation effects in neural networks, Physical Review E, 2007.

[27] W. Gerstner et al., Coherence and incoherence in a globally coupled ensemble of pulse-emitting units, Physical Review Letters, 1993.

[28] Alain Destexhe et al., A Master Equation Formalism for Macroscopic Modeling of Asynchronous Irregular Activity States, Neural Computation, 2009.

[29] J. Gärtner, Large deviations from the McKean-Vlasov limit for weakly interacting diffusions, 1987.

[30] D. Dawson, Multilevel large deviations and interacting diffusions, 1994.

[31] Peter Dayan et al., Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems, 2001.

[32] S. Varadhan et al., Asymptotic evaluation of certain Markov process expectations for large time, 1975.

[33] A. Guionnet, Averaged and quenched propagation of chaos for spin glass dynamics, 1997.

[34] B. Cessac, Increase in Complexity in Random Neural Networks, 1995.

[35] M. Lederman et al., Spin-glass dynamics: Relation between theory and experiment: a beginning, 1992.

[36] Haim Sompolinsky, Dynamic Theory of the Spin Glass Phase, 1981.

[37] R. Ellis, Entropy, large deviations, and statistical mechanics, 1985.

[38] Glassy behaviour in disordered systems with nonrelaxational dynamics, cond-mat/9606060, 1996.

[39] E. Olivieri et al., Large deviations and metastability: Large deviations and statistical mechanics, 2005.

[40] Amir Dembo et al., Large Deviations Techniques and Applications, 1998.

[41] Sompolinsky et al., Dynamics of spin systems with randomly asymmetric bonds: Ising spins and Glauber dynamics, Physical Review A, 1988.

[42] S. Kusuoka et al., The large deviation principle for hypermixing processes, 1988.

[43] Esko Valkeila, An Introduction to the Theory of Point Processes, Volume II: General Theory and Structure, 2nd Edition by Daryl J. Daley, David Vere-Jones, 2008.

[44] J. Neveu, Processus aléatoires gaussiens, 1968.

[45] Diego Fasoli et al., Three Applications of GPU Computing in Neuroscience, Computing in Science & Engineering, 2012.