Mutual information in the frequency domain

Coefficients of mutual information (MI) provide powerful extensions of classical correlation coefficients. In particular, they vanish if and only if the components involved are statistically independent, a property that proves useful in preparatory work for model building. In this article a frequency-domain variant of MI is developed and studied for bivariate stationary time series. As a scientific example, an ambient seismic noise data set is analyzed and a lack of independence between its components is inferred. The way the MI varies with frequency may be used to suggest the nature of the statistical dependence.
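For intuition, a frequency-domain MI coefficient can be motivated by the Gaussian case: for a bivariate normal pair with correlation ρ, the mutual information is -½ log(1 - ρ²), and a natural frequency-domain analogue replaces ρ² with the magnitude-squared coherence |R_xy(λ)|², which vanishes at a frequency exactly when the two series are uncorrelated there. The sketch below estimates such a curve; it is a minimal illustration under this Gaussian-motivated construction, not the paper's exact estimator, and the function name frequency_domain_mi and the smoothing parameters are illustrative choices.

```python
import numpy as np
from scipy import signal

def frequency_domain_mi(x, y, fs=1.0, nperseg=256):
    """Estimate a frequency-domain MI curve for two series.

    Under a Gaussian assumption, the MI density at frequency f is taken
    as -0.5 * log(1 - |R_xy(f)|^2), where |R_xy(f)|^2 is the
    magnitude-squared coherence; it is zero at f exactly when the
    estimated coherence there is zero.
    """
    f, cxy = signal.coherence(x, y, fs=fs, nperseg=nperseg)
    # Clip to keep the log finite when estimated coherence hits 1.
    cxy = np.clip(cxy, 0.0, 1.0 - 1e-12)
    mi = -0.5 * np.log(1.0 - cxy)
    return f, mi

# Toy example: a shared AR(1) driver induces low-frequency dependence.
rng = np.random.default_rng(0)
n = 4096
common = signal.lfilter([1.0], [1.0, -0.9], rng.standard_normal(n))
x = common + rng.standard_normal(n)
y = common + rng.standard_normal(n)
f, mi = frequency_domain_mi(x, y)
print(f[np.argmax(mi)], mi.max())
```

In the toy run the MI curve peaks at low frequencies, where the shared AR(1) driver concentrates its power; a curve that is flat and near zero across all frequencies would be consistent with independence of the two components.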
