Information dynamics with confidence: Using reservoir computing to construct confidence intervals for information-dynamic measures

Information dynamics provides a broad set of measures for characterizing how a dynamical system stores, processes, and transmits information. While estimators for these measures are commonly used in applications, the statistical properties of these estimators for finite time series are not well understood; in particular, the precision of a given estimate is generally unknown. We develop confidence intervals for generic information-dynamic parameters using a bootstrap procedure. The bootstrap procedure uses an echo state network, a particular instance of a reservoir computer, as a simulator to generate bootstrap samples from a given time series. Using two model systems, we perform a Monte Carlo analysis of the coverage and expected length of the bootstrap confidence intervals, and compare them to confidence intervals constructed with a simulator based on the random analog predictor. We find that our bootstrap procedure produces confidence intervals with nominal, or near-nominal, coverage of the information-dynamic measures, and with shorter expected length than the random analog predictor-based intervals. Finally, we demonstrate the applicability of the confidence intervals by characterizing the information dynamics of a time series of sunspot counts.
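To make the general idea concrete, the following is a minimal sketch, not the authors' exact procedure: a leaky-integrator echo state network is fit to a scalar series by ridge regression for one-step-ahead prediction, run closed-loop with resampled training residuals acting as dynamical noise to generate bootstrap series, and a percentile interval is formed from the estimates on those series. Permutation entropy stands in for the information-dynamic estimators studied in the paper, a noisy logistic map stands in for observed data, and all hyperparameters are illustrative choices rather than values from the study.

```python
# Sketch: ESN-based bootstrap confidence interval for an information measure.
# Assumptions (not from the paper): permutation entropy as the estimator,
# noisy logistic-map data, and untuned illustrative hyperparameters.
import numpy as np
from itertools import permutations

rng = np.random.default_rng(seed=42)

def train_esn(x, n_res=100, rho=0.9, leak=0.5, ridge=1e-6):
    """Fit a one-step-ahead leaky-integrator ESN to the scalar series x."""
    w_in = rng.uniform(-0.5, 0.5, size=n_res)
    w = rng.normal(size=(n_res, n_res))
    w *= rho / np.max(np.abs(np.linalg.eigvals(w)))   # set spectral radius
    r, states = np.zeros(n_res), []
    for u in x[:-1]:
        r = (1 - leak) * r + leak * np.tanh(w @ r + w_in * u)
        states.append(r)
    R, y = np.array(states), x[1:]
    w_out = np.linalg.solve(R.T @ R + ridge * np.eye(n_res), R.T @ y)
    resid = y - R @ w_out            # residuals drive the stochastic simulator
    return w, w_in, w_out, leak, resid

def simulate_esn(model, u0, n_steps):
    """Run the trained ESN closed-loop, resampling residuals as noise."""
    w, w_in, w_out, leak, resid = model
    r, u, out = np.zeros(len(w)), u0, []
    for _ in range(n_steps):
        r = (1 - leak) * r + leak * np.tanh(w @ r + w_in * u)
        u = r @ w_out + rng.choice(resid)
        out.append(u)
    return np.array(out)

def perm_entropy(x, m=3):
    """Plug-in permutation entropy of order m, in bits (stand-in estimator)."""
    counts = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - m + 1):
        counts[tuple(np.argsort(x[i:i + m]).tolist())] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

# Toy data: a noisy logistic map standing in for an observed time series.
x = np.empty(500)
x[0] = 0.4
for t in range(len(x) - 1):
    x[t + 1] = np.clip(3.9 * x[t] * (1 - x[t]) + 0.01 * rng.normal(), 0.0, 1.0)

model = train_esn(x)
boot = [perm_entropy(simulate_esn(model, x[-1], len(x))) for _ in range(100)]
lo, hi = np.percentile(boot, [2.5, 97.5])   # 95% percentile-bootstrap interval
print(f"estimate: {perm_entropy(x):.3f} bits, 95% CI: [{lo:.3f}, {hi:.3f}]")
```

Resampling the one-step residuals is what turns the trained network from a deterministic forecaster into a stochastic simulator; in practice the reservoir size, spectral radius, leak rate, and ridge penalty would all need to be tuned to the series at hand.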
