Stability Analysis of Reservoir Computers Dynamics via Lyapunov Functions

A Lyapunov design method is used to analyze the nonlinear stability of a generic reservoir computer for both continuous-time and discrete-time dynamics. Using this method, a radial region of stability around a fixed point is determined analytically for a given nonlinear reservoir computer. We find that the training error of the reservoir computer is lower in the region where the analysis predicts global stability, but it also depends on the particular choice of dynamics for the individual reservoir systems. When the dynamics are polynomial, it appears important that the polynomial have nonzero coefficients for at least one odd power (e.g., a linear term) and at least one even power (e.g., a quadratic term).
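
As a brief illustrative sketch of the kind of condition such an analysis yields (the generic reservoir model and the quadratic Lyapunov function below are assumptions for illustration, not the specific construction used in the paper), consider a discrete-time reservoir with state \(\mathbf{r}_k\) and a fixed point at the origin,

\[
\mathbf{r}_{k+1} = f(\mathbf{r}_k), \qquad f(\mathbf{0}) = \mathbf{0}.
\]

With the candidate Lyapunov function \(V(\mathbf{r}) = \mathbf{r}^\top \mathbf{r}\), a sufficient condition for asymptotic stability is that the one-step change in \(V\) be negative,

\[
\Delta V_k \;=\; V(\mathbf{r}_{k+1}) - V(\mathbf{r}_k) \;=\; \|f(\mathbf{r}_k)\|^2 - \|\mathbf{r}_k\|^2 \;<\; 0,
\]

for all nonzero \(\mathbf{r}_k\) with \(\|\mathbf{r}_k\| < R\); the largest such \(R\) plays the role of the radial region of stability around the fixed point. For continuous-time dynamics \(\dot{\mathbf{r}} = g(\mathbf{r})\), the analogous requirement is \(\dot{V} = 2\,\mathbf{r}^\top g(\mathbf{r}) < 0\) on the same ball.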
