Some Characterizations of Global Exponential Stability of a Generic Class of Continuous-Time Recurrent Neural Networks

This paper reveals two important characterizations of global exponential stability (GES) for a generic class of continuous-time recurrent neural networks. First, we show that GES of the neural networks can be fully characterized by global asymptotic stability (GAS) of the networks together with the condition that the maximum abscissa of the spectral set of the Jacobian matrix of the networks at the unique equilibrium point is less than zero. This result provides a direct and practical way to distinguish GES from GAS for the neural networks. Second, we show that when the neural networks have small state feedback coefficients, the supremum of the exponential convergence rates (ECRs) of the network trajectories is exactly equal to the absolute value of the maximum abscissa of the spectral set of the Jacobian matrix at the unique equilibrium point. Here, the supremum of the ECRs indicates the fastest attainable speed of trajectory convergence. The obtained results are helpful in understanding the essence of GES and in clarifying the difference between GES and GAS for continuous-time recurrent neural networks.
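The spectral-abscissa test described above can be sketched numerically. The snippet below is a minimal illustration, assuming a standard network form dx/dt = -Dx + Ws(x) + u with s = tanh as the activation; the specific matrices D, W, and input u are hypothetical choices for demonstration, not taken from the paper. With small state feedback coefficients (small entries of W), the fixed-point map is a contraction, so a unique equilibrium exists and simple iteration finds it; the Jacobian at that equilibrium is J = -D + W diag(s'(x*)), and a negative maximum abscissa of its spectrum signals exponential convergence at a rate approaching the abscissa's absolute value.

```python
import numpy as np

def equilibrium(D, W, u, iters=2000):
    # Fixed-point iteration x = D^{-1}(W tanh(x) + u); this converges
    # under the assumption that W is small enough for the map to be a
    # contraction (the "small state feedback coefficients" regime).
    x = np.zeros(len(u))
    for _ in range(iters):
        x = np.linalg.solve(D, W @ np.tanh(x) + u)
    return x

def spectral_abscissa(D, W, x_star):
    # Jacobian of dx/dt = -Dx + W tanh(x) + u at x*, using
    # d/dx tanh(x) = 1 - tanh(x)^2; return max real part of its spectrum.
    J = -D + W @ np.diag(1.0 - np.tanh(x_star) ** 2)
    return np.max(np.linalg.eigvals(J).real)

# Hypothetical example data: identity decay rates and small feedback weights.
D = np.eye(2)
W = np.array([[0.1, -0.2],
              [0.3,  0.1]])
u = np.array([0.5, -0.5])

x_star = equilibrium(D, W, u)
alpha = spectral_abscissa(D, W, x_star)
print(alpha < 0)  # negative abscissa -> exponential convergence, rate ~ |alpha|
```

In this small-feedback regime the abscissa is strictly negative, so the equilibrium is exponentially stable; per the paper's second result, |alpha| is then the supremum of the achievable exponential convergence rates.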
