Non-Euclidean Contractivity of Recurrent Neural Networks

Critical questions in dynamical neuroscience and machine learning concern the stability, robustness, entrainment, and computational efficiency of recurrent neural networks. These properties can all be established through the development of a comprehensive contractivity theory for neural networks. This paper makes three sets of contributions. First, regarding ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights, monotonicity results for principal submatrices, and closed-form worst-case expressions over certain matrix polytopes. Second, regarding nonsmooth contraction theory, we show that the one-sided Lipschitz constant of a Lipschitz vector field equals the essential supremum of the logarithmic norm of its Jacobian. Third, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing-rate, Persidskii, Lur’e, and other models. For each model, we compute the optimal contraction rate and weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute and total contraction.
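
For concreteness, the ℓ1 and ℓ∞ logarithmic norms appearing in the first contribution admit standard closed forms; the identities below are textbook facts about logarithmic norms, not results specific to this paper:

    \mu_1(A) = \max_{j} \Big( a_{jj} + \sum_{i \neq j} |a_{ij}| \Big), \qquad
    \mu_\infty(A) = \max_{i} \Big( a_{ii} + \sum_{j \neq i} |a_{ij}| \Big).

In this notation, the nonsmooth result in the second contribution states that, for a Lipschitz vector field f with almost-everywhere defined Jacobian Df,

    \operatorname{osL}(f) = \operatorname{ess\,sup}_{x} \, \mu\big( Df(x) \big),

where osL(f) denotes the one-sided Lipschitz constant of f with respect to the chosen norm.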

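The linear-program test in the third contribution can be sketched as follows. This is a minimal illustration under the usual construction: the Metzler majorant ⌈A⌉ keeps the diagonal of A and replaces every off-diagonal entry by its absolute value, and any positive weight vector η with ⌈A⌉η ≤ −cη certifies that the η-weighted ℓ∞ logarithmic norm of A is at most −c. All function names and the bisection wrapper below are illustrative, not taken from the paper.

    import numpy as np
    from scipy.optimize import linprog

    def metzler_majorant(A):
        # Keep the diagonal of A; replace off-diagonal entries by absolute values.
        M = np.abs(A)
        np.fill_diagonal(M, np.diag(A))
        return M

    def weights_for_rate(A, rate):
        # Feasibility LP: find eta >= 1 componentwise with (ceil(A) + rate*I) eta <= 0.
        # Such eta certifies that the eta-weighted ell-infinity logarithmic norm
        # of A is at most -rate, i.e., contraction at rate `rate`.
        n = A.shape[0]
        res = linprog(c=np.zeros(n),                                  # pure feasibility
                      A_ub=metzler_majorant(A) + rate * np.eye(n),    # (ceil(A) + rate*I) eta <= 0
                      b_ub=np.zeros(n),
                      bounds=[(1.0, None)] * n,                       # eta_i >= 1 normalizes the weights
                      method="highs")
        return res.x if res.success else None

    def optimal_rate(A, hi=100.0, tol=1e-8):
        # Bisection on the rate; the supremum equals minus the spectral abscissa
        # of the Metzler majorant, a Perron-Frobenius fact for Metzler matrices.
        lo = 0.0
        if weights_for_rate(A, lo) is None:
            return None  # spectral abscissa of ceil(A) is positive: no rate c >= 0 works
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if weights_for_rate(A, mid) is not None else (lo, mid)
        return lo

As a sanity check, for A = np.array([[-2.0, 1.0], [0.5, -3.0]]) (already Metzler, so ⌈A⌉ = A), optimal_rate(A) returns c ≈ 1.634, which matches minus the spectral abscissa (−5 + √3)/2 of A, with the optimal weights given by the corresponding Perron eigenvector.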