Mutual information, synergy and some curious phenomena for simple channels

Suppose we are allowed to observe two equally noisy versions of some signal X, where the level of the noise is fixed. We are given a choice: we can observe either two independent noisy versions of X or two correlated ones. We show that, contrary to what classical statistical intuition suggests, correlated data is often more valuable than independent data. We investigate this phenomenon in a variety of contexts, give numerous examples for standard families of channels, and present general sufficient conditions for deciding this dilemma. One of these conditions draws an interesting connection with the information-theoretic notion of "synergy," which has recently received a lot of attention in the neuroscience literature.
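A minimal numerical sketch of the dilemma, under illustrative assumptions of our own (binary symmetric channels with crossover probability eps, and a specific anti-correlated noise coupling that is not necessarily one of the paper's examples): X is a fair bit, each observation is Y_i = X XOR N_i with P(N_i = 1) = eps, and we compare I(X; Y1, Y2) when the noise bits N1, N2 are independent versus maximally anti-correlated.

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def mutual_info(joint):
    """I(X; Y) for a joint pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

eps = 0.1  # crossover probability of each marginal BSC (illustrative value)

# X uniform on {0,1}; Y_i = X xor N_i with P(N_i = 1) = eps.

# Case 1: independent noise bits N1, N2.
indep = {}
for x in (0, 1):
    for n1 in (0, 1):
        for n2 in (0, 1):
            p = 0.5 * (eps if n1 else 1-eps) * (eps if n2 else 1-eps)
            key = (x, (x ^ n1, x ^ n2))
            indep[key] = indep.get(key, 0.0) + p

# Case 2: maximally anti-correlated noise (valid for eps < 1/2):
# P(N1=N2=1) = 0, P(N1=1,N2=0) = P(N1=0,N2=1) = eps, P(N1=N2=0) = 1-2*eps.
noise = {(0, 0): 1 - 2*eps, (1, 0): eps, (0, 1): eps, (1, 1): 0.0}
anti = {}
for x in (0, 1):
    for (n1, n2), q in noise.items():
        key = (x, (x ^ n1, x ^ n2))
        anti[key] = anti.get(key, 0.0) + 0.5 * q

print(f"one observation      : {1 - h(eps):.4f} bits")
print(f"independent noise    : {mutual_info(indep):.4f} bits")
print(f"anti-correlated noise: {mutual_info(anti):.4f} bits")
```

With eps = 0.1 this prints roughly 0.742 bits for independent noise versus 0.800 bits for anti-correlated noise: whenever Y1 = Y2 the anti-correlated coupling reveals X exactly, so the correlated pair is strictly more informative here, in line with the phenomenon described above.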
