Partial transfer entropy on rank vectors

For the evaluation of information flow in bivariate time series, several information measures have been employed: the transfer entropy (TE); the symbolic transfer entropy (STE), defined analogously to TE but on the ranks of the components of the reconstructed vectors; and the transfer entropy on rank vectors (TERV), similar to STE but forming the rank of the future sample of the response system with respect to its current reconstructed vector. Here we extend TERV to multivariate time series, accounting for the presence of confounding variables, and call the resulting measure the partial transfer entropy on rank vectors (PTERV). We investigate the asymptotic properties of PTERV, as well as of the partial STE (PSTE), construct parametric significance tests under Gaussian and gamma approximations of the null distribution, and show that the parametric tests cannot achieve the power of the randomization test using time-shifted surrogates. Applying parametric and randomization significance tests in simulations on known coupled dynamical systems, we show that PTERV performs better than PSTE but worse than the partial transfer entropy (PTE). However, unlike PTE, PTERV is robust to the presence of drifts in the time series and is not affected by the level of detrending.
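To make the construction concrete, the following is a minimal sketch, in Python with NumPy, of rank-vector (ordinal pattern) encoding and a plug-in PTERV-style statistic: the conditional mutual information between the rank of the response's future sample (formed with respect to its current reconstructed vector, as in TERV) and the driver's rank pattern, given the rank patterns of the response and of a confounding variable. All helper names (rank_patterns, plugin_entropy, cmi_plugin, pterv_like) are hypothetical, and the sketch omits the embedding conventions and bias corrections of the published estimator.

```python
# Minimal sketch (not the published estimator): rank-vector encoding and a
# plug-in PTERV-style statistic. All helper names are hypothetical.
import numpy as np
from collections import Counter

def rank_patterns(x, m, tau=1):
    """Encode each delay vector (x_t, x_{t+tau}, ..., x_{t+(m-1)tau})
    by the ranks of its components (0 = smallest)."""
    n = len(x) - (m - 1) * tau
    return [tuple(np.argsort(np.argsort(x[t:t + m * tau:tau])))
            for t in range(n)]

def plugin_entropy(symbols):
    """Plug-in Shannon entropy (in nats) of a discrete symbol sequence."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cmi_plugin(a, b, c):
    """Plug-in conditional mutual information
    I(A; B | C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)."""
    return (plugin_entropy(list(zip(a, c))) + plugin_entropy(list(zip(b, c)))
            - plugin_entropy(list(zip(a, b, c))) - plugin_entropy(c))

def pterv_like(x, y, z, m=3, tau=1):
    """PTERV-style statistic for the coupling X -> Y given confounder Z:
    I(rank of future y ; X rank pattern | Y rank pattern, Z rank pattern)."""
    # Augmented response patterns of length m+1: the last component is the
    # rank of the future sample y_{t+m*tau} within the current vector of y.
    aug = rank_patterns(y, m + 1, tau)
    n = len(aug)
    r_future = [p[-1] for p in aug]
    xpat = rank_patterns(x, m, tau)[:n]
    ypat = rank_patterns(y, m, tau)[:n]
    zpat = rank_patterns(z, m, tau)[:n]
    cond = list(zip(ypat, zpat))  # joint conditioning symbol: (Y, Z) patterns
    return cmi_plugin(r_future, xpat, cond)
```

Setting z to a constant (or dropping it from the conditioning set) would reduce this to a TERV-style bivariate statistic.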

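The randomization test can be sketched in the same spirit. Time-shifted surrogates circularly shift the driver series, which destroys any driver-to-response coupling while preserving the marginal dynamics of each series; the observed statistic is then ranked against the surrogate statistics. Below is a minimal sketch assuming the pterv_like statistic above; timeshift_test and its parameters are hypothetical names.

```python
# Minimal sketch of the time-shifted surrogate randomization test, assuming
# the pterv_like statistic above; timeshift_test is a hypothetical name.
import numpy as np

def timeshift_test(x, y, z, stat, n_surr=100, min_shift=20, seed=0):
    """One-sided randomization test: circularly shifting the driver x
    destroys the X -> Y coupling but preserves each series' own dynamics."""
    rng = np.random.default_rng(seed)
    t_obs = stat(x, y, z)
    shifts = rng.integers(min_shift, len(x) - min_shift, size=n_surr)
    t_surr = np.array([stat(np.roll(x, s), y, z) for s in shifts])
    # rank-based p-value, counting the observed value among the surrogates
    return (1 + np.sum(t_surr >= t_obs)) / (n_surr + 1)
```

For example, p = timeshift_test(x, y, z, pterv_like) yields a p-value for the null hypothesis of no direct coupling X -> Y given Z, rejected at level alpha when p < alpha.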