Information transfer from causal history in complex system dynamics.

In a multivariate evolving system, the present state of a variable is the outcome of all interacting variables through the temporal history of the system. How can we quantify the information transferred from the history of all variables to the outcome of a specific variable at a specific time? We develop information-theoretic measures that quantify the information transfer from this entire history, which we call the causal history. We partition this causal history into an immediate causal history, parameterized by a lag τ from the present, to capture the influence of recent dynamics, and the complementary distant causal history. Each of these influences is further decomposed into self- and cross-feedbacks. By invoking the Markov property of the directed acyclic time-series graph, we reduce the dimensionality of the proposed information-theoretic measures and thereby obtain an efficient estimation algorithm. This approach also reveals an information-aggregation property: information from historical dynamics accumulates at the times immediately preceding, and directly influencing, the present state of the variable(s) of interest. These formulations allow us to analyze complex interdependencies in unprecedented ways. We illustrate our approach by: (1) characterizing memory dependence in a synthetic system with short memory; (2) distinguishing the proposed measures from traditional ones such as lagged mutual information using the Lorenz chaotic model; (3) comparing the memory dependence of two long-memory processes, with and without a strange attractor, using the Lorenz model and a linear Ornstein-Uhlenbeck process; and (4) showing how the dynamics of a complex system are sustained through the interacting contributions of self- and cross-dependencies in both immediate and distant causal histories, using the Lorenz model and observed stream-chemistry data known to exhibit 1/f long memory.
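The lag-partitioned view of causal history can be illustrated with a minimal sketch. This is not the paper's estimator (which exploits the Markov property of the time-series graph for dimension reduction); it is only a histogram-based lagged mutual information on a hypothetical two-variable system with a self-feedback in X and a cross-feedback from X to Y, showing how the information that Y's recent history carries about its present state decays with lag τ:

```python
import numpy as np

def mutual_info(x, y, bins=16):
    """Histogram (plug-in) estimate of mutual information I(x; y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1)          # marginal of x (rows)
    py = pxy.sum(axis=0)          # marginal of y (columns)
    nz = pxy > 0                  # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

# Synthetic coupled system (illustrative, not from the paper):
# X has a self-feedback; Y has a self-feedback and a cross-feedback from X.
rng = np.random.default_rng(0)
n = 20000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal(scale=0.5)                  # self-feedback in X
    y[t] = 0.6 * y[t - 1] + 0.4 * x[t - 1] + rng.normal(scale=0.5)  # cross-feedback X -> Y

def lagged_dependencies(tau):
    """Information shared between Y_t and lag-tau history, split into
    a self-dependency (on Y_{t-tau}) and a cross-dependency (on X_{t-tau})."""
    yt = y[tau:]
    return mutual_info(yt, y[:-tau]), mutual_info(yt, x[:-tau])

self_1, cross_1 = lagged_dependencies(1)
self_10, cross_10 = lagged_dependencies(10)
```

For this short-memory system, both the self- and cross-dependency terms shrink as the lag grows, which is the kind of decay the immediate-versus-distant partition is designed to resolve.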
