A Bivariate Measure of Redundant Information

We define a measure of redundant information based on projections in the space of probability distributions. Redundant information between random variables is information that is shared between those variables; in contrast to mutual information, however, it refers to information the variables share about the outcome of a third variable. Formalizing this concept, and being able to measure it, is required for the non-negative decomposition of mutual information into redundant and synergistic components. Previous attempts to formalize redundant or synergistic information have struggled to capture some desired properties. We introduce a new formalism for redundant information and prove that it satisfies all the necessary properties outlined in earlier work, as well as an additional criterion that we propose as necessary for capturing redundancy. We also demonstrate the behavior of this measure on several examples, compare it to previous measures, and apply it to the decomposition of transfer entropy.
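For orientation, the role such a redundancy measure plays can be sketched via the bivariate partial information decomposition of Williams and Beer, the framework that motivates measures of this kind. The accounting below is a standard sketch in our own notation (the paper's symbols may differ): fixing the redundancy term determines the unique terms, and with them the synergy.

% Bivariate partial information decomposition (after Williams & Beer);
% Z is the target, X and Y the predictors. Notation is illustrative.
\begin{align}
  I(Z; X, Y) &= I_{\mathrm{red}}(Z; X, Y) + U(Z; X \setminus Y)
              + U(Z; Y \setminus X) + I_{\mathrm{syn}}(Z; X, Y) \\
  I(Z; X)    &= I_{\mathrm{red}}(Z; X, Y) + U(Z; X \setminus Y) \\
  I(Z; Y)    &= I_{\mathrm{red}}(Z; X, Y) + U(Z; Y \setminus X)
\end{align}

Eliminating the unique terms gives $I_{\mathrm{syn}} = I(Z; X, Y) - I(Z; X) - I(Z; Y) + I_{\mathrm{red}}$, so the co-information equals $I_{\mathrm{red}} - I_{\mathrm{syn}}$; requiring all four terms to be non-negative is what constrains admissible definitions of $I_{\mathrm{red}}$.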
