Quantifying Redundant Information in Predicting a Target Random Variable

We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure and propose new measures that satisfy some of them.
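To make the notion of redundant information concrete, the sketch below computes one well-known candidate measure from the literature, the I_min redundancy of Williams and Beer, for two sources X1, X2 and a target Y. This is an illustration of what such measures quantify, not the measures proposed here; the dictionary-based joint-distribution representation is an implementation choice for the example.

```python
from collections import defaultdict
from math import log2

def i_min(joint):
    """Williams-Beer I_min redundancy of (X1, X2) about Y.

    joint: dict mapping (x1, x2, y) -> probability (must sum to 1).
    I_min(Y; X1, X2) = sum_y p(y) * min_i I(X_i; Y=y), where
    I(X_i; Y=y) = sum_x p(x|y) log2( p(y|x) / p(y) ) is the
    specific information source X_i carries about the outcome y.
    """
    p_y = defaultdict(float)                       # p(y)
    p_xy = [defaultdict(float), defaultdict(float)]  # p(x_i, y)
    p_x = [defaultdict(float), defaultdict(float)]   # p(x_i)
    for (x1, x2, y), p in joint.items():
        p_y[y] += p
        for i, x in enumerate((x1, x2)):
            p_xy[i][(x, y)] += p
            p_x[i][x] += p

    total = 0.0
    for y, py in p_y.items():
        spec = []
        for i in range(2):
            s = 0.0
            for (x, yy), pxy in p_xy[i].items():
                if yy != y or pxy == 0.0:
                    continue
                # p(x|y) * log2( p(y|x) / p(y) )
                s += (pxy / py) * log2((pxy / p_x[i][x]) / py)
            spec.append(s)
        total += py * min(spec)  # redundancy: the least any source tells us
    return total

# Fully redundant case: X1 = X2 = Y, a fair bit -> 1 bit of redundancy.
copy = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(i_min(copy))  # 1.0

# XOR: Y = X1 ^ X2 with independent fair inputs -> no redundancy.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
print(i_min(xor))  # 0.0
```

The XOR case illustrates why defining redundancy is subtle: each source alone carries zero information about Y, yet together they determine it, so all the information here is synergistic rather than redundant.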
