An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems

To fully characterize the information that two source variables carry about a third target variable, one must decompose the total information into redundant, unique, and synergistic components, i.e., obtain a partial information decomposition (PID). However, Shannon's theory of information does not provide formulas to fully determine these quantities. Several recent studies have begun to address this problem: candidate definitions for the PID quantities have been proposed, and analyses have so far been carried out on systems composed of discrete variables. Here we present an in-depth analysis of PIDs on Gaussian systems, both static and dynamical. We show that, for a broad class of Gaussian systems, previously proposed PID formulas imply that (i) redundancy reduces to the minimum information provided by either source variable, and hence is independent of the correlation between sources, and (ii) synergy is the extra information contributed by the weaker source when the stronger source is known, and can either increase or decrease with the correlation between sources. We find that Gaussian systems frequently exhibit net synergy, i.e., the information carried jointly by both sources is greater than the sum of the information carried by each source individually. Drawing on several explicit examples, we discuss the implications of these findings for measures of information transfer and for information-based measures of complexity, both generally and within a neuroscience setting. Importantly, by providing independent formulas for synergy and redundancy applicable to continuous time-series data, we offer an approach to characterizing and quantifying information sharing amongst complex-system variables.
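As a concrete illustration of the quantities described above, the sketch below computes Gaussian mutual informations from a joint covariance matrix and applies the minimum-mutual-information (MMI) style decomposition the abstract describes: redundancy as the smaller of the two marginal informations, synergy as the joint information minus the larger marginal information (the extra contribution of the weaker source once the stronger one is known), and net synergy as the joint information minus the sum of the marginals. This is a minimal sketch; the function names, the unit-variance parameterization by correlations a, b, c, and the example values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_mi_bits(cov, idx_a, idx_b):
    """Mutual information (bits) between two blocks of a jointly Gaussian
    vector, from the joint covariance: I = 0.5*log(|S_A||S_B|/|S_AB|)."""
    logdet = lambda ix: np.linalg.slogdet(cov[np.ix_(ix, ix)])[1]
    return 0.5 * (logdet(idx_a) + logdet(idx_b)
                  - logdet(sorted(idx_a + idx_b))) / np.log(2)

def mmi_pid(a, b, c):
    """MMI-style PID for univariate Gaussian sources X, Y and target Z with
    unit variances, Corr(X,Z)=a, Corr(Y,Z)=b, Corr(X,Y)=c (illustrative)."""
    cov = np.array([[1.0, c, a],
                    [c, 1.0, b],
                    [a, b, 1.0]])                # variable order: X, Y, Z
    i_xz = gaussian_mi_bits(cov, [0], [2])       # I(X;Z)
    i_yz = gaussian_mi_bits(cov, [1], [2])       # I(Y;Z)
    i_xy_z = gaussian_mi_bits(cov, [0, 1], [2])  # I(X,Y;Z)
    red = min(i_xz, i_yz)                # redundancy = weaker source's info
    syn = i_xy_z - max(i_xz, i_yz)       # extra info from the weaker source
    return dict(redundancy=red,
                unique_x=i_xz - red,
                unique_y=i_yz - red,
                synergy=syn,
                net_synergy=i_xy_z - i_xz - i_yz)

# Varying the source correlation c leaves I(X;Z), I(Y;Z) -- and hence the
# MMI redundancy -- unchanged, while the synergy changes with c.
for c in (-0.2, 0.0, 0.3):
    print(c, mmi_pid(a=0.7, b=0.5, c=c))
```

In this parameterization the marginal informations depend only on a and b, so sweeping c isolates the behavior of synergy and net synergy; for some values of c the joint information exceeds the sum of the marginal informations, i.e., net synergy is positive, as in the abstract.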
