Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss

Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of information gain lattices, which separates the information that a set of variables contains about another variable into components interpretable as the unique information of individual variables, or as redundancy and synergy components. In this work, we extend this framework, focusing on the lattices that underpin the decomposition. We generalize the type of constructible lattices and examine the relations between different lattices, for example relating bivariate and trivariate decompositions. We point out that, in information gain lattices, redundancy components are invariant across decompositions, whereas unique and synergy components depend on the decomposition chosen. Exploiting the connection between different lattices, we propose a procedure to construct, in the general multivariate case, information gain decompositions from measures of synergy or unique information. We then introduce an alternative type of lattice, the information loss lattice, in which the roles and invariance properties of the redundancy and synergy components are reversed with respect to gain lattices, and which provides an alternative procedure to build multivariate decompositions. We finally show how dual information gain and information loss lattices lead to a self-consistent unique decomposition, which affords a deeper understanding of the origin and meaning of synergy and redundancy.
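
As a concrete illustration of the bivariate gain decomposition described above, the total information I(S; X1, X2) splits as Red + Unq(X1) + Unq(X2) + Syn, with I(S; Xi) = Red + Unq(Xi). The following minimal Python sketch computes these components using the I_min redundancy measure of Williams and Beer [11] on the XOR gate; the joint distribution, function names, and choice of example are illustrative assumptions, not code from the paper:

```python
from collections import defaultdict
from math import log2

# Joint distribution p(x1, x2, s) for the XOR gate:
# x1 and x2 are independent fair bits and s = x1 XOR x2.
p = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def marginal(p, idxs):
    """Marginal distribution over the coordinates listed in idxs."""
    m = defaultdict(float)
    for outcome, pr in p.items():
        m[tuple(outcome[i] for i in idxs)] += pr
    return m

def mutual_info(p, a_idxs, s_idx=2):
    """I(S; A) in bits, where A is the tuple of variables at a_idxs."""
    pa, ps = marginal(p, a_idxs), marginal(p, [s_idx])
    pas = marginal(p, a_idxs + [s_idx])
    return sum(pr * log2(pr / (pa[k[:-1]] * ps[k[-1:]]))
               for k, pr in pas.items() if pr > 0)

def i_min(p, sources, s_idx=2):
    """Williams-Beer redundancy I_min: for each target outcome s, take the
    minimum specific information over the sources, then average over p(s)."""
    ps = marginal(p, [s_idx])
    total = 0.0
    for (s,), p_s in ps.items():
        specific = []
        for a_idxs in sources:
            pa = marginal(p, a_idxs)
            pas = marginal(p, a_idxs + [s_idx])
            # specific information I(S=s; A) = sum_a p(a|s) log2(p(s|a)/p(s))
            specific.append(sum((p_as / p_s) * log2(p_as / (pa[k[:-1]] * p_s))
                                for k, p_as in pas.items()
                                if k[-1] == s and p_as > 0))
        total += p_s * min(specific)
    return total

red = i_min(p, [[0], [1]])                # redundancy component
i1, i2 = mutual_info(p, [0]), mutual_info(p, [1])
i12 = mutual_info(p, [0, 1])
unq1, unq2 = i1 - red, i2 - red           # unique components
syn = i12 - i1 - i2 + red                 # synergy component
print(red, unq1, unq2, syn)               # XOR: 0.0 0.0 0.0 1.0
```

For XOR, neither input alone carries any information about the output, so I_min and both unique terms vanish and the full bit of I(S; X1, X2) is assigned to synergy, the textbook case of a purely synergistic interaction.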

[1] Daniel Polani, et al., Information Flows in Causal Networks, 2008, Adv. Complex Syst.

[2] Larissa Albantakis, et al., From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0, 2014, PLoS Comput. Biol.

[3] Christoph Salge, et al., A Bivariate Measure of Redundant Information, 2012, Physical Review E.

[4] Mikhail Prokopenko, et al., Differentiating information transfer and causal effect, 2008, arXiv:0812.4373.

[5] Virgil Griffith, et al., Synergy, Redundancy and Common Information, 2015, arXiv.

[6] Paul L. Williams, et al., Information dynamics: Its theory and application to embodied cognitive systems, 2011.

[7] J. Macke, et al., Neural population coding: combining insights from microscopic and mass signals, 2015, Trends in Cognitive Sciences.

[8] D. Anastassiou, Computational analysis of the synergy among multiple interacting genes, 2007, Molecular Systems Biology.

[9] Nihat Ay, et al., Hierarchical Quantification of Synergy in Channels, 2016, Front. Robot. AI.

[10] Daniele Marinazzo, et al., Synergy and redundancy in the Granger causal analysis of dynamical networks, 2014, New Journal of Physics.

[11] Randall D. Beer, et al., Nonnegative Decomposition of Multivariate Information, 2010, arXiv.

[12] Benjamin Flecker, et al., Synergy, redundancy, and multivariate information measures: an experimentalist's perspective, 2014, Journal of Computational Neuroscience.

[13] A. J. Bell, The co-information lattice, 2003.

[14] Michael J. Berry, et al., Network information and connected correlations, 2003, Physical Review Letters.

[15] Robin A. A. Ince, Measuring multivariate redundant information with pointwise common change in surprisal, 2016, Entropy.

[16] Karl J. Friston, et al., Effective connectivity: Influence, causality and biophysical modeling, 2011, NeuroImage.

[17] Eckehard Olbrich, et al., Information Decomposition and Synergy, 2015, Entropy.

[18] James P. Crutchfield, et al., Multivariate Dependence Beyond Shannon Information, 2016, Entropy.

[19] A. Pouget, et al., Neural correlations, population coding and computation, 2006, Nature Reviews Neuroscience.

[20] Luca Faes, et al., Estimating the decomposition of predictive information in multivariate systems, 2015, Physical Review E.

[21] Tian Zheng, et al., Inference of Regulatory Gene Interactions from Expression Data Using Three-Way Mutual Information, 2009, Annals of the New York Academy of Sciences.

[22] James P. Crutchfield, et al., Intersection Information Based on Common Randomness, 2013, Entropy.

[23] S. Panzeri, et al., An exact method to quantify the information transmitted by different mechanisms of correlational coding, 2003, Network.

[24] P. Latham, et al., Synergy, Redundancy, and Independence in Population Codes, Revisited, 2005, The Journal of Neuroscience.

[25] Daniel Chicharro, et al., Algorithms of causal inference for the analysis of effective connectivity among brain regions, 2014, Front. Neuroinform.

[26] Thomas M. Cover, et al., Elements of Information Theory, 2005.

[27] Juliana Y. Rhee, et al., Acute off-target effects of neural circuit manipulations, 2015, Nature.

[28] A. Ledberg, et al., When two become one: the limits of causality analysis of brain dynamics, 2012, PLoS ONE.

[29] Zengcai V. Guo, et al., Neural coding during active somatosensation revealed using illusory touch, 2013, Nature Neuroscience.

[30] Stefano Panzeri, et al., On Decoding the Responses of a Population of Neurons from Short Time Windows, 1999, Neural Computation.

[31] Joseph T. Lizier, et al., Towards a synergy-based approach to measuring information modification, 2013, IEEE Symposium on Artificial Life (ALife).

[32] P. Latham, et al., Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior, 2017, Neuron.

[33] Eckehard Olbrich, et al., Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems, 2012, arXiv.

[34] Victor Solo, et al., On causality and mutual information, 2008, 47th IEEE Conference on Decision and Control.

[35] Stefano Panzeri, et al., Information-theoretic methods for studying population codes, 2010, Neural Networks.

[36] Shun-ichi Amari, et al., Information geometry on hierarchy of probability distributions, 2001, IEEE Trans. Inf. Theory.

[37] William J. McGill, Multivariate information transmission, 1954, Trans. IRE Prof. Group Inf. Theory.

[38] Adam B. Barrett, et al., An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems, 2014, Physical Review E.

[39] Daniel Chicharro, et al., A Causal Perspective on the Analysis of Signal and Noise Correlations and Their Role in Population Coding, 2014, Neural Computation.

[40] Christof Koch, et al., Quantifying synergistic mutual information, 2012, arXiv.

[41] Randall D. Beer, et al., Generalized Measures of Information Transfer, 2011, arXiv.

[42] M. Bethge, et al., Inferring decoding strategies from choice probabilities in the presence of correlated variability, 2013, Nature Neuroscience.

[43] Michael J. Berry, et al., Synergy, Redundancy, and Independence in Population Codes, 2003, The Journal of Neuroscience.

[44] E. T. Rolls, et al., Correlations and the encoding of information in the nervous system, 1999, Proceedings of the Royal Society of London. Series B: Biological Sciences.

[45] Martin Brown, et al., Information-theoretic sensitivity analysis: a general method for credit assignment in complex networks, 2007, Journal of The Royal Society Interface.

[46] A. Ledberg, et al., Framework to study dynamic dependencies in networks of interacting processes, 2012, Physical Review E.

[47] Joseph T. Lizier, et al., Directed Information Measures in Neuroscience, 2014.

[48] D. Chicharro, et al., On the spectral formulation of Granger causality, 2011, Biological Cybernetics.

[49] Luca Faes, et al., An Information-Theoretic Framework to Map the Spatiotemporal Dynamics of the Scalp Electroencephalogram, 2016, IEEE Transactions on Biomedical Engineering.

[50] Jim Kay, et al., Partial information decomposition as a unified approach to the specification of neural goal functions, 2015, Brain and Cognition.

[51] Eckehard Olbrich, et al., Reconsidering unique information: Towards a multivariate information decomposition, 2014, IEEE International Symposium on Information Theory.

[52] Robin A. A. Ince, The Partial Entropy Decomposition: Decomposing multivariate entropy and mutual information via pointwise common surprisal, 2017, arXiv.

[53] Eckehard Olbrich, et al., Quantifying unique information, 2013, Entropy.