Global stability of a class of continuous-time recurrent neural networks

This paper investigates the global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for the existence and uniqueness of an equilibrium of neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions that ascertain the GAS of neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. We then give two GES conditions for neural networks whose activation functions may not be monotone nondecreasing. We also provide a Lyapunov diagonal stability condition, without the nonsingularity requirement on the connection weight matrices, that ascertains the GES of neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. This Lyapunov diagonal stability condition generalizes and unifies many existing GAS and GES results. Moreover, two improved estimates of the exponential convergence rate are derived.
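As a rough illustration of the kind of global stability the abstract describes, the sketch below simulates a continuous-time recurrent network and checks that trajectories from different initial states reach the same equilibrium. The abstract does not state the model equations; the standard Hopfield-type form dx/dt = -Dx + Wg(x) + u with g = tanh (globally Lipschitz and monotone nondecreasing) is assumed here, and the particular matrices `D`, `W`, and input `u` are illustrative choices, not taken from the paper.

```python
import numpy as np

# Assumed model (not specified in the abstract): dx/dt = -D x + W g(x) + u,
# with activation g = tanh, which is globally Lipschitz (constant 1) and
# monotone nondecreasing, matching the class treated in the paper.

def simulate(D, W, u, x0, dt=1e-3, steps=20000):
    """Integrate dx/dt = -D x + W tanh(x) + u by forward Euler."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-D @ x + W @ np.tanh(x) + u)
    return x

# Illustrative parameters chosen so that D dominates |W|, a classical
# sufficient condition for a unique, globally stable equilibrium.
D = np.diag([2.0, 2.0])
W = np.array([[0.5, -0.3],
              [0.4,  0.2]])
u = np.array([0.1, -0.2])

# Trajectories started far apart should converge to the same point.
x_a = simulate(D, W, u, x0=[5.0, -5.0])
x_b = simulate(D, W, u, x0=[-3.0, 4.0])
print(np.allclose(x_a, x_b, atol=1e-6))  # same equilibrium from both starts
```

This only demonstrates convergence numerically for one parameter choice; the paper's contribution is proving such behavior analytically for the whole class, including activations that need not be monotone.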
