Estimation of Directed Dependencies in Time Series Using Conditional Mutual Information and Non-linear Prediction

It is well known that estimating the directed dependency between high-dimensional data sequences suffers from the "curse of dimensionality". To reduce the dimensionality of the data, and thereby improve the accuracy of the estimation, we propose a new progressive input variable selection technique. In each iteration, the remaining input variables are ranked according to a weighted sum of the amount of new information a variable provides and that variable's prediction accuracy; the highest-ranked variable is then included only if it significantly improves the prediction accuracy. A simulation study on synthetic non-linear autoregressive and Hénon-map data shows a significant improvement over existing estimators, especially for small amounts of high-dimensional, highly correlated data.
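The selection loop sketched in the abstract can be illustrated in code. The following is a minimal Python sketch, not the paper's actual estimator: it assumes a crude binned plug-in estimate of the conditional mutual information (a practical implementation would typically use a nearest-neighbour estimator), a cross-validated k-NN regressor's R² as the prediction-accuracy term, an illustrative weighting parameter lam, and a fixed improvement threshold min_gain in place of a proper significance test.

```python
# Hedged sketch of progressive input variable selection: rank remaining
# variables by a weighted sum of new information (CMI) and prediction
# accuracy, then add the top variable if it still improves prediction.
# The CMI estimator, predictor, lam, and min_gain are all illustrative.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

def binned_cmi(x, y, Z, bins=6):
    """Crude histogram plug-in estimate of I(x; y | Z); Z may be None."""
    def disc(v):  # discretize one variable into `bins` equal-width bins
        return np.digitize(v, np.histogram_bin_edges(v, bins=bins)[1:-1])
    def H(*cols):  # joint entropy of discretized columns
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))
    x, y = disc(x), disc(y)
    if Z is None:  # nothing selected yet: plain mutual information
        return H(x) + H(y) - H(x, y)
    cz = [disc(Z[:, k]) for k in range(Z.shape[1])]
    return H(x, *cz) + H(y, *cz) - H(x, y, *cz) - H(*cz)

def pred_score(X_sel, target):
    """Prediction accuracy: cross-validated R^2 of a k-NN regressor."""
    return cross_val_score(KNeighborsRegressor(n_neighbors=5),
                           X_sel, target, cv=3).mean()

def progressive_selection(X, target, lam=0.5, min_gain=1e-3):
    """Greedily add variables ranked by lam*CMI + (1-lam)*accuracy."""
    selected, remaining, best_score = [], list(range(X.shape[1])), -np.inf
    while remaining:
        Z = X[:, selected] if selected else None
        best = max(remaining,
                   key=lambda j: lam * binned_cmi(X[:, j], target, Z)
                   + (1 - lam) * pred_score(X[:, selected + [j]], target))
        score = pred_score(X[:, selected + [best]], target)
        if score - best_score < min_gain:  # no significant improvement: stop
            break
        best_score = score
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: the target depends non-linearly on columns 0 and 2 only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = np.sin(X[:, 0]) + 0.5 * X[:, 2] ** 2 + 0.1 * rng.normal(size=500)
print(progressive_selection(X, y))
```

On the toy data above, the loop should typically recover columns 0 and 2 and then stop, since no other column adds significant predictive information.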
