Sufficient and necessary conditions for global exponential stability of discrete-time recurrent neural networks

A set of sufficient and necessary conditions is presented for global exponential stability (GES) of a class of generic discrete-time recurrent neural networks. By means of these conditions, the GES and convergence properties of the neural networks are analyzed quantitatively. It is shown that exact equivalences hold among the GES of the neural networks, the contractiveness of the deduced nonlinear operators, and the global asymptotic stability (GAS) of the neural networks together with the spectral radius of the Jacobian matrix of the neural networks at the unique equilibrium point being less than one. When the neural networks have small state feedback coefficients, it is further shown that the infimum of the exponential bounds of the trajectories equals exactly the spectral radius of the Jacobian matrix at the unique equilibrium point. The obtained results are helpful in understanding the essence of GES and in clarifying the difference between GES and GAS of discrete-time recurrent neural networks.
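For concreteness, a minimal sketch of the setting follows; the abstract does not restate the network model, so the notation below is an illustrative assumption (the standard generic discrete-time recurrent form) rather than the paper's own definitions.

$$x(k+1) = A\,x(k) + W g\big(x(k)\big) + u, \qquad k = 0, 1, 2, \ldots,$$

where $A = \mathrm{diag}(a_1,\ldots,a_n)$ collects the state feedback coefficients, $W$ is the connection weight matrix, $g$ is the activation mapping, and $u$ is a constant input. In this notation, GES at the unique equilibrium $x^{\ast}$ means that there exist constants $M \ge 1$ and $0 \le \beta < 1$ such that

$$\|x(k) - x^{\ast}\| \le M\,\beta^{k}\,\|x(0) - x^{\ast}\|, \qquad k \ge 0,$$

and the spectral radius condition referred to above reads $\rho\big(A + W\,\mathrm{diag}(g'(x^{\ast}))\big) < 1$, i.e., the Jacobian of the update map at $x^{\ast}$ has spectral radius less than one. Under this reading, the result on small state feedback coefficients says that the infimum of admissible decay rates $\beta$ coincides with this spectral radius.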
