Mutual Information Rate and Bounds for It

The amount of information exchanged per unit of time between two nodes of a dynamical network, or between two data sets, is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems and whose definition rests on the probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks, or between two time series that are not fully deterministic, and to calculate its upper and lower bounds without having to estimate probabilities, expressing them instead in terms of well-known and well-defined quantities in dynamical systems. As possible applications of these bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
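
To make concrete the kind of "well-defined quantities in dynamical systems" the abstract refers to, the Python sketch below estimates the two Lyapunov exponents of a pair of diffusively coupled logistic maps and forms lambda_1 - lambda_2 as a candidate upper bound on the MIR between the two units. The coupled system, its parameter values, and the identification of the bound with this exponent difference are illustrative assumptions of ours, not formulas quoted from the abstract; the sketch only shows how such a bound could be computed without estimating probabilities.

import numpy as np

# Minimal sketch (our assumptions, not the paper's exact procedure):
# two diffusively coupled logistic maps; the two Lyapunov exponents are
# obtained with the standard tangent-map / QR method, and lambda_1 - lambda_2
# is taken as a candidate Lyapunov-based upper bound on the MIR.

def f(x, r=3.9):
    return r * x * (1.0 - x)

def df(x, r=3.9):
    return r * (1.0 - 2.0 * x)

def lyapunov_exponents(sigma, n_steps=100_000, n_transient=1_000, seed=0):
    rng = np.random.default_rng(seed)
    x, y = rng.uniform(0.1, 0.9, size=2)

    Q = np.eye(2)        # orthonormal frame evolved by the Jacobian
    sums = np.zeros(2)   # accumulated log stretching factors

    for n in range(n_transient + n_steps):
        fx, fy = f(x), f(y)
        # Diffusive coupling: each unit is pulled towards the other's image.
        x_new = (1.0 - sigma) * fx + sigma * fy
        y_new = (1.0 - sigma) * fy + sigma * fx

        # Jacobian of the coupled map evaluated at (x, y).
        J = np.array([[(1.0 - sigma) * df(x), sigma * df(y)],
                      [sigma * df(x), (1.0 - sigma) * df(y)]])

        # Evolve the frame, re-orthonormalise, and accumulate after the transient.
        Q = J @ Q
        Q, R = np.linalg.qr(Q)
        if n >= n_transient:
            sums += np.log(np.abs(np.diag(R)))

        x, y = x_new, y_new

    return np.sort(sums / n_steps)[::-1]   # lambda_1 >= lambda_2

if __name__ == "__main__":
    for sigma in (0.0, 0.05, 0.1, 0.2, 0.3):
        l1, l2 = lyapunov_exponents(sigma)
        print(f"sigma={sigma:.2f}  lambda_1={l1:+.3f}  lambda_2={l2:+.3f}  "
              f"candidate MIR upper bound (l1 - l2) = {max(l1 - l2, 0.0):.3f}")

The loop uses the standard tangent-map method with QR (Gram-Schmidt) reorthonormalisation so the evolved frame stays well conditioned; longer runs (larger n_steps) give better-converged exponents.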
