Attractor reconstruction by machine learning.

A machine-learning approach called "reservoir computing" has been used successfully for short-term prediction and attractor reconstruction of chaotic dynamical systems from time series data. We present a theoretical framework that describes conditions under which reservoir computing can produce an empirical model that makes skillful short-term forecasts and accurately reproduces the long-term ergodic behavior of the original system. We illustrate the theory with numerical experiments and argue that it also applies to certain other machine-learning methods for time series prediction.
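
As a concrete illustration of the kind of empirical model the abstract refers to, the sketch below trains a simple reservoir computer (an echo state network) to forecast the Lorenz system from time series data. This is a minimal example under common reservoir-computing conventions, not the specific construction analyzed in the paper: the reservoir size, spectral radius, input scaling, and ridge-regularization value are illustrative choices, and only NumPy is assumed.

```python
import numpy as np

# ----- Generate Lorenz-63 training data with a fourth-order Runge-Kutta step -----
def lorenz_deriv(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt):
    k1 = lorenz_deriv(s)
    k2 = lorenz_deriv(s + 0.5 * dt * k1)
    k3 = lorenz_deriv(s + 0.5 * dt * k2)
    k4 = lorenz_deriv(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

dt, n_train, n_pred, washout = 0.02, 5000, 1000, 100
traj = np.empty((n_train + n_pred, 3))
traj[0] = [1.0, 1.0, 1.0]
for i in range(1, len(traj)):
    traj[i] = rk4_step(traj[i - 1], dt)

# ----- Random reservoir (size, spectral radius, input scaling are illustrative) -----
rng = np.random.default_rng(0)
N, spectral_radius, input_scale, ridge = 500, 0.9, 0.5, 1e-6
A = rng.uniform(-1.0, 1.0, (N, N)) * (rng.random((N, N)) < 0.02)   # sparse coupling
A *= spectral_radius / np.max(np.abs(np.linalg.eigvals(A)))        # rescale eigenvalues
W_in = rng.uniform(-input_scale, input_scale, (N, 3))

# ----- Drive the reservoir with the measured time series ("listening" phase) -----
r = np.zeros(N)
states = np.empty((n_train, N))
for i in range(n_train):
    r = np.tanh(A @ r + W_in @ traj[i])
    states[i] = r

# ----- Train the linear readout by ridge regression: reservoir state -> next sample -----
X, Y = states[washout:-1], traj[washout + 1:n_train]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y).T

# ----- Closed-loop prediction: feed the model's own output back as its input -----
pred = np.empty((n_pred, 3))
for i in range(n_pred):
    u = W_out @ r            # predicted next measurement
    pred[i] = u
    r = np.tanh(A @ r + W_in @ u)

err = np.linalg.norm(pred - traj[n_train:], axis=1)
print("error after ~1 Lyapunov time:", err[int(1.1 / dt)])
```

After training, the model runs in closed loop, feeding its own output back as its input. With settings in this range it typically tracks the true trajectory for a few Lyapunov times before diverging, and thereafter continues to generate orbits on a Lorenz-like attractor, which is the long-term ergodic ("climate") behavior the theory addresses.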
