Learn to Synchronize, Synchronize to Learn

In recent years, the artificial intelligence community has shown sustained interest in research investigating the dynamical aspects of both training procedures and machine learning models. Among recurrent neural networks, the Reservoir Computing (RC) paradigm stands out for its conceptual simplicity and fast training scheme. Yet the guiding principles under which RC operates are only partially understood. In this work, we analyze the role played by Generalized Synchronization (GS) when training an RC to solve a generic task. In particular, we show how GS allows the reservoir to correctly encode, in its dynamics, the system generating the input signal. We also discuss necessary and sufficient conditions for learning to be feasible in this approach. Moreover, we explore the role that ergodicity plays in this process, showing how its presence allows the learning outcome to apply to multiple input trajectories. Finally, we show that the satisfaction of GS can be measured by means of the mutual false nearest neighbors index, which makes the theoretical derivations directly usable by practitioners.
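To illustrate how the mutual false nearest neighbors (MFNN) index can be estimated in practice, the following is a minimal sketch, not the implementation used in this work: a random contractive reservoir is driven by the x-coordinate of a Lorenz-63 trajectory, and the MFNN ratio is computed between drive states and reservoir states. When a smooth synchronization map exists, the ratio stays of order one; in its absence it grows large. All parameter values, the Euler integration scheme, and the function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Drive system: Lorenz-63 trajectory (simple Euler integration, toy settings).
dt, T = 0.01, 2000
s = np.empty((T, 3))
s[0] = (1.0, 1.0, 1.0)
for t in range(T - 1):
    x, y, z = s[t]
    s[t + 1] = s[t] + dt * np.array([10.0 * (y - x),
                                     x * (28.0 - z) - y,
                                     x * y - (8.0 / 3.0) * z])
u = s[:, 0] / 10.0  # scalar input signal fed to the reservoir

# Response system: random reservoir scaled to spectral radius 0.5 (contractive
# regime, where generalized synchronization with the drive is expected).
N = 30
W = rng.normal(size=(N, N))
W *= 0.5 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(size=N)
r = np.zeros((T, N))
for t in range(T - 1):
    r[t + 1] = np.tanh(W @ r[t] + w_in * u[t])

def _pdist(a):
    """All pairwise Euclidean distances between rows of a."""
    q = (a * a).sum(axis=1)
    return np.sqrt(np.maximum(q[:, None] + q[None, :] - 2.0 * a @ a.T, 0.0))

def mfnn(drive, resp, skip=500):
    """Mean mutual-false-nearest-neighbors ratio; of order one under smooth GS."""
    d, h = drive[skip:], resp[skip:]          # discard the transient
    dx, dy = _pdist(d), _pdist(h)
    np.fill_diagonal(dx, np.inf)
    np.fill_diagonal(dy, np.inf)
    nx, ny = dx.argmin(axis=1), dy.argmin(axis=1)  # nearest neighbors in each space
    i = np.arange(len(d))
    return float(np.mean((dy[i, nx] / dx[i, nx]) * (dx[i, ny] / dy[i, ny])))

print(mfnn(s[:-1], r[1:]))  # MFNN index for the aligned drive/response pair
```

The alignment pairs the reservoir state r[t+1] with the drive state s[t] that produced it; under the echo state property the former is (approximately) a function of the latter, which is exactly the relation the MFNN ratio probes.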
