Quantifying Synergistic Information Using Intermediate Stochastic Variables

Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple source variables together predict an outcome variable better than the sum of the individual single-source predictions. Synergy is an essential phenomenon in biology, for instance in neuronal networks and cellular regulatory processes where different information flows are integrated into a single response, but it also appears in social cooperation processes and in statistical inference tasks in machine learning. Here we propose measures of synergistic entropy and synergistic information derived from first principles. The proposed measure relies on so-called synergistic random variables (SRVs), which are constructed to have zero mutual information with each individual source variable but non-zero mutual information with the complete set of source variables. We prove several basic and desirable properties of our measure, including bounds and additivity properties. We also prove several important consequences of the measure, notably that different types of synergistic information can coexist between the same sets of variables. A numerical implementation is provided, which we use to demonstrate that synergy is associated with resilience to noise. Our measure may be a marked step forward in the study of multivariate information theory and its numerous applications.
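A minimal numerical sketch (not the authors' implementation) may help illustrate the SRV condition stated above, namely that a variable S over sources X_1, ..., X_n satisfies I(S; X_i) = 0 for every individual source while I(S; X_1, ..., X_n) > 0. The canonical example is the XOR of two independent fair bits: the parity bit shares no information with either input alone, yet carries one full bit about the pair. The Python snippet below verifies this directly from the joint distributions; all function and variable names are introduced here for illustration only.

```python
# Illustrative check of the SRV property for Z = X XOR Y, where X and Y are
# independent fair bits. Z is a synergistic random variable for (X, Y):
# I(Z; X) = I(Z; Y) = 0, but I(Z; X, Y) = 1 bit.
from itertools import product
from math import log2
from collections import defaultdict

def mutual_information(joint):
    """I(A;B) in bits, given a joint distribution as a dict {(a, b): p}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Enumerate the uniform joint distribution over (X, Y) and the induced Z.
p_x_z, p_y_z, p_xy_z = defaultdict(float), defaultdict(float), defaultdict(float)
for x, y in product((0, 1), repeat=2):
    z = x ^ y
    p_x_z[(x, z)] += 0.25
    p_y_z[(y, z)] += 0.25
    p_xy_z[((x, y), z)] += 0.25

print(mutual_information(p_x_z))   # 0.0 bit: Z tells nothing about X alone
print(mutual_information(p_y_z))   # 0.0 bit: Z tells nothing about Y alone
print(mutual_information(p_xy_z))  # 1.0 bit: Z carries one bit about (X, Y) jointly
```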
