Probing High-Order Dependencies With Information Theory

Information-theoretic measures (entropies, entropy rates, mutual information) are now commonly used in statistical signal processing for real-world data analysis. This paper proposes auto mutual information (the mutual information between subsets of the same signal) and entropy rate as powerful tools to assess refined dependencies of any order in the temporal dynamics of a signal. Notably, it is shown how two-point auto mutual information and entropy rate capture information conveyed by higher-order statistics and thus reveal details of temporal dynamics that are overlooked by the (two-point) correlation function; see the definitions and the sketch below. The statistical performance of relevant estimators of auto mutual information and entropy rate is studied numerically, by means of Monte Carlo simulations, as a function of sample size, dependence structure, and the hyperparameters entering their definitions. Furthermore, it is shown how auto mutual information makes it possible to discriminate between several non-Gaussian processes that share exactly the same marginal distribution and covariance function. Assessing higher-order statistics via multipoint auto mutual information is also shown to unveil the global dependence structure of these processes, indicating that one of the non-Gaussian processes has temporal dynamics resembling those of a Gaussian process with the same covariance, while the other does not.
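For reference, the two quantities at the heart of the abstract admit the following standard definitions for a stationary process X_t (the notation is the usual information-theoretic one and is supplied here as an assumption, not quoted from the paper):

    I(\tau) = I(X_t ; X_{t+\tau}) = H(X_t) + H(X_{t+\tau}) - H(X_t, X_{t+\tau})

    h = \lim_{n \to \infty} \frac{1}{n}\, H(X_1, \ldots, X_n) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1)

For a jointly Gaussian pair, I(\tau) = -\tfrac{1}{2} \log\bigl(1 - \rho(\tau)^2\bigr) depends only on the correlation coefficient \rho(\tau); any estimated departure of the auto mutual information from this curve is therefore a signature of dependencies beyond second order, which is precisely the effect the abstract describes.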

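The sketch below illustrates one way to estimate lag-dependent auto mutual information with a k-nearest-neighbor estimator. It uses scikit-learn's mutual_info_regression (a Kraskov-style k-NN estimator) as a stand-in; this is a minimal illustration under that assumption, not the estimator implementation evaluated in the paper. The Gaussian AR(1) benchmark is included only as a sanity check against the closed form given above.

import numpy as np
from sklearn.feature_selection import mutual_info_regression

def auto_mutual_information(x, max_lag=50, n_neighbors=3):
    # Estimate I(X_t; X_{t+tau}) in nats for tau = 1..max_lag,
    # using a k-NN mutual information estimator.
    x = np.asarray(x, dtype=float)
    ami = np.empty(max_lag)
    for tau in range(1, max_lag + 1):
        past = x[:-tau].reshape(-1, 1)   # samples of X_t
        future = x[tau:]                 # samples of X_{t+tau}
        ami[tau - 1] = mutual_info_regression(
            past, future, n_neighbors=n_neighbors, random_state=0)[0]
    return ami

# Sanity check on a Gaussian AR(1) process, x_t = a*x_{t-1} + w_t, for which
# rho(tau) = a**tau and the exact AMI is -0.5 * log(1 - rho(tau)**2).
rng = np.random.default_rng(0)
a, n, max_lag = 0.8, 10_000, 10
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.standard_normal()
estimated = auto_mutual_information(x, max_lag=max_lag)
exact = -0.5 * np.log(1.0 - a ** (2 * np.arange(1, max_lag + 1)))
print(np.round(estimated, 3))
print(np.round(exact, 3))

For a Gaussian input the two printed rows should roughly agree; applied instead to a non-Gaussian process with the same covariance, a systematic gap between the estimated and Gaussian-benchmark curves exposes the higher-order dependencies discussed in the abstract.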