A Robust Estimation of Information Flow in Coupled Nonlinear Systems

Transfer entropy (TE) is a recently proposed measure of the information flow between coupled linear or nonlinear systems. In this study, we first suggest improvements in the selection of parameters for the estimation of TE that significantly enhance its accuracy and robustness in identifying the direction and the level of information flow between observed data series generated by coupled complex systems. Second, a new measure, the net transfer of entropy (NTE), is defined based on TE. Third, we employ surrogate analysis to show the statistical significance of the measures. Fourth, the effect of measurement noise on the measures' performance is investigated at signal-to-noise ratios down to \(S/N = 3\) dB. We demonstrate the usefulness of the improved method by analyzing data series from coupled nonlinear chaotic oscillators. Our findings suggest that TE and NTE may play a critical role in elucidating the functional connectivity of complex networks of nonlinear systems.
