Large deviations and mean-field theory for asymmetric random recurrent neural networks

Abstract. In this article, we study the asymptotic dynamics of a noisy discrete-time neural network with random asymmetric couplings and thresholds. More precisely, we focus on the limit behaviour of the network as its size grows to infinity while time remains bounded. In the case of Gaussian connection weights, we use the same techniques as Ben Arous and Guionnet (see [3]) to prove that the image law, under the empirical measure, of the distribution of the neurons' activation states satisfies a temperature-free large deviation principle. Moreover, we prove that if the connection weights satisfy a general condition of domination by Gaussian tails, then the distribution of the activation potential of each neuron converges weakly towards an explicit Gaussian law, whose characteristics are given by the mean-field equations stated by Cessac-Doyon-Quoy-Samuelides (see [4–6]). Furthermore, under this hypothesis, we obtain a law of large numbers and a propagation-of-chaos result. Finally, we show that many classical distributions on the couplings fulfill our general condition. Thus, this paper provides rigorous mean-field results for a large class of neural networks that is currently investigated in the neural network literature.
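The abstract does not spell out the update rule of the network, so the following sketch only illustrates the class of models it refers to: a discrete-time recurrent network with i.i.d. Gaussian asymmetric couplings of variance scaling like 1/N, random thresholds, a sigmoid transfer function, and additive synaptic noise, in the spirit of the Cessac-Doyon-Quoy-Samuelides model. The function name, the choice of np.tanh, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_network(N=2000, T=20, sigma_J=1.0, theta_mean=0.0,
                     theta_std=0.5, noise_std=0.1, seed=0):
    """Simulate a noisy discrete-time random recurrent network of size N.

    Assumed dynamics (illustrative): u_i(t) = sum_j J_ij x_j(t) - theta_i + noise,
    x_i(t+1) = f(u_i(t)), with asymmetric Gaussian couplings J_ij ~ N(0, sigma_J^2 / N).
    Returns the activation potentials u(t) for t = 1, ..., T.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, sigma_J / np.sqrt(N), size=(N, N))  # random asymmetric couplings
    theta = rng.normal(theta_mean, theta_std, size=N)        # random thresholds
    x = rng.uniform(-1.0, 1.0, size=N)                       # initial activation states
    f = np.tanh                                               # sigmoid transfer function

    potentials = []
    for _ in range(T):
        u = J @ x - theta + noise_std * rng.normal(size=N)    # activation potentials with synaptic noise
        x = f(u)                                              # next activation states
        potentials.append(u)
    return np.array(potentials)

# As N grows, the empirical distribution of the potentials across neurons is
# expected to look approximately Gaussian, in line with the mean-field picture
# described in the abstract.
u = simulate_network()
print("final-time potentials: mean = %.3f, std = %.3f" % (u[-1].mean(), u[-1].std()))
```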