Online stability of backpropagation-decorrelation recurrent learning

We provide a stability analysis, based on nonlinear feedback theory, for the recently introduced backpropagation-decorrelation (BPDC) recurrent learning algorithm, which adapts only the output weights of a possibly large network and can therefore learn in O(N). Using a small-gain criterion, we derive a simple sufficient stability inequality. The condition can be monitored online to ensure that the recurrent network remains stable, and it applies in principle to any network that adapts only its output weights. Based on these results, BPDC learning is further enhanced with an efficient online rescaling algorithm that stabilizes the network while it adapts. In simulations we find that this mechanism improves learning in the provably stable domain. As a byproduct, we show that BPDC is highly competitive on standard data sets, including the recently introduced CATS benchmark data [CATS data. URL: http://www.cis.hut.fi/lendasse/competition/competition.html].
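
The stabilization mechanism described above lends itself to a compact online loop: after each output-weight update, evaluate a sufficient small-gain quantity and, if the bound is violated, rescale only the adapted rows. The abstract does not reproduce the BPDC update rule or the paper's exact inequality, so the Python sketch below is a minimal illustration under stated assumptions: it uses the generic sufficient contraction condition L·||W||₂ < 1 (L the Lipschitz constant of the activation, L = 1 for tanh), and all identifiers (`small_gain`, `rescale_output_rows`, `out_idx`) are hypothetical, not the paper's formulation.

```python
import numpy as np

def small_gain(W, lipschitz=1.0):
    """Generic small-gain quantity L * ||W||_2; a value < 1 is sufficient
    for contraction of x(k+1) = tanh(W x(k) + u(k)). (||W||_F is a cheaper
    upper bound if the exact spectral norm is too costly to track online.)"""
    return lipschitz * np.linalg.norm(W, 2)

def rescale_output_rows(W, out_idx, target=0.95, lipschitz=1.0, max_iter=50):
    """Shrink only the adapted output rows of W until the small-gain
    condition holds again. The frozen reservoir rows are untouched, so the
    loop converges only if they satisfy the bound on their own (ensured
    here by explicitly scaling the initial matrix below 1)."""
    for _ in range(max_iter):
        g = small_gain(W, lipschitz)
        if g < 1.0:
            break
        W[out_idx, :] *= target / g  # pull the adapted rows back
    return W

# Toy usage: one (stand-in) output-weight update, then the stability check.
rng = np.random.default_rng(0)
N, out_idx = 50, [0]                       # neuron 0 plays the output neuron
W = rng.standard_normal((N, N))
W *= 0.8 / np.linalg.norm(W, 2)            # frozen part provably inside the bound
x = rng.standard_normal(N)

W[out_idx, :] += 0.5 * rng.standard_normal((1, N))  # stand-in for a BPDC step
W = rescale_output_rows(W, out_idx)                 # restore the provably stable regime
assert small_gain(W) < 1.0
x = np.tanh(W @ x)                                  # one network update
```

Because only the output rows are rescaled, the frozen reservoir is never modified; for large N the exact spectral norm in the check can be replaced by the Frobenius upper bound to keep the per-step monitoring cheap.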
