Synergy, redundancy, and multivariate information measures: an experimentalist’s perspective

Information theory has long been used to quantify interactions between two variables. With the rise of complex systems research, multivariate information measures have been increasingly used to investigate interactions among groups of three or more variables, often with an emphasis on so-called synergistic and redundant interactions. While bivariate information measures are commonly agreed upon, the multivariate information measures in use today have been developed by many different groups and differ in subtle yet significant ways. Here, we review these multivariate information measures, with special emphasis on their relationship to synergy and redundancy, and examine the differences between the measures by applying them to several simple model systems. Beyond these model systems, we illustrate the usefulness of the measures by analyzing neural spiking data from a dissociated culture through the early stages of its development. Our aim is to help other researchers select the multivariate information measure best suited to their specific research goals and system. Finally, we have made software available online that allows the user to calculate all of the information measures discussed within this paper.
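
As a rough illustration of the kinds of quantities discussed (and not the authors' released software), the following minimal Python sketch computes Shannon entropy, bivariate mutual information, and interaction information for small discrete joint distributions. The XOR and copied-bit distributions, the function names, and the sign convention used here (positive read as redundancy-dominated, negative as synergy-dominated) are illustrative assumptions, not taken from the paper.

# Minimal sketch (not the authors' released toolbox): entropy-based measures
# on small discrete joint distributions, contrasting a synergistic system
# (XOR) with a redundant one (a single bit copied three times).
from itertools import product
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a dict mapping outcomes -> probabilities."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def marginal(p, idx):
    """Marginal distribution over the variable indices in idx."""
    out = {}
    for outcome, q in p.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + q
    return out

def mutual_info(p, a, b):
    """I(A;B) = H(A) + H(B) - H(A,B); a and b are index tuples into the joint outcome."""
    return entropy(marginal(p, a)) + entropy(marginal(p, b)) - entropy(marginal(p, a + b))

def interaction_info(p):
    """Interaction information in the co-information sign convention:
    I(X;Y;Z) = I(X;Y) - I(X;Y|Z) for a joint distribution over (x, y, z).
    Under this convention, positive values are commonly read as
    redundancy-dominated and negative values as synergy-dominated."""
    i_xy = mutual_info(p, (0,), (1,))
    # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)
    i_xy_given_z = (entropy(marginal(p, (0, 2))) + entropy(marginal(p, (1, 2)))
                    - entropy(marginal(p, (2,))) - entropy(p))
    return i_xy - i_xy_given_z

# Synergistic example: Z = X XOR Y, with X and Y independent fair bits.
xor = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
# Redundant example: X = Y = Z, a single fair bit copied three times.
copy3 = {(b, b, b): 0.5 for b in (0, 1)}

print(interaction_info(xor))    # -1.0 bit (synergy-dominated)
print(interaction_info(copy3))  # +1.0 bit (redundancy-dominated)

The sketch only handles exact joint distributions over a few discrete variables; estimating these measures from finite spiking data raises the sampling-bias issues treated in the literature the paper reviews.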
