On global asymptotic stability of fully connected recurrent neural networks
Conditions for global asymptotic stability (GAS) of a nonlinear relaxation process realized by a recurrent neural network (RNN) are provided. The existence, convergence, and robustness of such a process are analyzed. The analysis rests upon the contraction mapping theorem (CMT) and the corresponding fixed-point iteration (FPI). The derived upper bounds for such a process are shown to coincide with the convergence conditions for a commonly analyzed RNN with a linear state dependence.
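As a brief illustration of the kind of condition the CMT yields (a sketch in our own notation, not drawn from the paper itself), consider a relaxation of the form $z_{k+1} = \Phi(W z_k + b)$, where the activation $\Phi$ is Lipschitz with constant $L$:

$$
\|z_{k+1} - z_k\| = \|\Phi(W z_k + b) - \Phi(W z_{k-1} + b)\| \le L\,\|W\|\,\|z_k - z_{k-1}\|.
$$

If $L\,\|W\| < 1$, the map is a contraction, so by the CMT the FPI converges to a unique fixed point from any initial state, i.e. the relaxation is globally asymptotically stable. For the logistic activation $\Phi(x) = 1/(1 + e^{-\beta x})$ one has $L = \beta/4$, giving the representative bound $\|W\| < 4/\beta$.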