Local Stability Analysis of Discrete-Time, Continuous-State, Complex-Valued Recurrent Neural Networks With Inner State Feedback

Recurrent neural networks (RNNs) are well known for their capability to minimize suitable cost functions without the need for a training phase. This is possible because they can be Lyapunov stable. Although global stability analysis has attracted considerable interest, local stability is desirable for specific applications. In this brief, we investigate the local asymptotic stability of two classes of discrete-time, continuous-state, complex-valued RNNs with parallel update and inner state feedback. We show that many previously known results are special cases of those obtained here. We also generalize some known results from the real-valued case to the complex-valued one. Finally, we investigate stability in the presence of time-varying activation functions. The complex-valued activation functions considered in this brief are separable with respect to their real and imaginary parts.
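For illustration only, the following minimal Python sketch (not from the paper) simulates a discrete-time, complex-valued RNN of the kind described above: parallel (synchronous) state update, inner state feedback via the weight matrix, and an activation that is separable in its real and imaginary parts. It also performs a heuristic local-stability check based on the spectral radius of the real-valued Jacobian at a reached fixed point. All names (separable_tanh, iterate, W, b), the tanh-based activation, and the random parameters are illustrative assumptions, not the authors' model.

import numpy as np

def separable_tanh(z):
    # Separable complex activation: tanh applied to the real and imaginary parts independently.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def iterate(W, b, x0, steps=200):
    # Parallel (synchronous) update x[k+1] = f(W x[k] + b) for a fixed number of steps.
    x = x0
    for _ in range(steps):
        x = separable_tanh(W @ x + b)
    return x

# Small example network; the (generally nonzero) diagonal of W provides inner state feedback.
rng = np.random.default_rng(0)
n = 4
W = 0.3 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
b = 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
x_star = iterate(W, b, np.zeros(n, dtype=complex))   # approximate fixed point if the iteration converges

# Heuristic local-stability check: Jacobian of the real map (Re x, Im x) -> (Re f(Wx+b), Im f(Wx+b)).
u = W @ x_star + b
d_re = 1.0 - np.tanh(u.real) ** 2    # derivative of Re f w.r.t. Re u
d_im = 1.0 - np.tanh(u.imag) ** 2    # derivative of Im f w.r.t. Im u
A, B = W.real, W.imag                # Re u = A*Re x - B*Im x + Re b,  Im u = B*Re x + A*Im x + Im b
J = np.block([[d_re[:, None] * A, -d_re[:, None] * B],
              [d_im[:, None] * B,  d_im[:, None] * A]])
rho = max(abs(np.linalg.eigvals(J)))
print(f"spectral radius at fixed point: {rho:.3f} "
      f"({'locally asymptotically stable' if rho < 1 else 'inconclusive'})")

A spectral radius below one indicates local asymptotic stability of the fixed point under this assumed model; this is only a numerical sanity check, not a substitute for the analytical conditions derived in the brief.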
