Generalised Measures of Multivariate Information Content

The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information sharing. It is shown that the distinct ways in which a set of marginal observers can share their information with a non-observing third party correspond to the elements of a free distributive lattice. The redundancy lattice from partial information decomposition is then derived independently by combining the algebraic structures of joint and shared information content.
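
To make the lattice correspondence concrete, the following Python sketch (illustrative only, not from the paper; the function names are hypothetical) enumerates the antichains of non-empty subsets of n source variables. These antichains are the nodes of the Williams-Beer redundancy lattice, and their counts (1, 4, 18 for n = 1, 2, 3) equal the Dedekind numbers minus two, i.e. the sizes of the free distributive lattice on n generators referred to above.

```python
from itertools import combinations

def nonempty_subsets(n):
    """All non-empty subsets of the source indices {0, ..., n-1}."""
    return [frozenset(c)
            for r in range(1, n + 1)
            for c in combinations(range(n), r)]

def is_antichain(sets):
    """True if no set in the collection is a proper subset of another."""
    return all(not a < b for a in sets for b in sets)

def lattice_nodes(n):
    """Enumerate antichains of non-empty subsets of n sources.

    These are the nodes of the redundancy lattice of partial
    information decomposition; their number equals the size of the
    free distributive lattice on n generators (brute-force, so only
    practical for small n).
    """
    subsets = nonempty_subsets(n)
    return [set(combo)
            for r in range(1, len(subsets) + 1)
            for combo in combinations(subsets, r)
            if is_antichain(combo)]

def precedes(alpha, beta):
    """Redundancy-lattice order: alpha <= beta iff every element of
    beta contains some element of alpha."""
    return all(any(a <= b for a in alpha) for b in beta)

if __name__ == "__main__":
    for n in (1, 2, 3):
        print(n, len(lattice_nodes(n)))  # expected: 1, 4, 18
```

Running the script prints 1, 4 and 18 nodes for one, two and three sources, matching the free-distributive-lattice sizes, and precedes implements the usual ordering in which, for example, the antichain {{1},{2}} lies below {{1,2}}.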
