Understanding Interdependency Through Complex Information Sharing

The interactions among three or more random variables are often nontrivial, poorly understood, and yet paramount for future advances in fields such as network information theory, neuroscience, genetics, and many others. In this work, we propose to analyze these interactions as different modes of information sharing. Toward this end, we introduce a novel axiomatic framework for decomposing the joint entropy, which characterizes the various ways in which random variables can share information. The key contribution of our framework is to distinguish between redundant interdependencies, where the information is shared among the parts, and synergistic interdependencies, where the sharing structure exists in the whole but not between the parts. We show that our axioms determine unique formulas for all the terms of the proposed decomposition in a number of cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
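To make the redundancy/synergy distinction concrete, consider the canonical XOR example: if X and Y are independent fair coin flips and Z = X XOR Y, then Z is independent of X alone and of Y alone, yet (X, Y) together determine Z, so the information about Z is purely synergistic. The Python sketch below illustrates this using only standard Shannon quantities (entropies, mutual informations, and McGill's co-information); it is an illustrative computation under these assumptions, not an implementation of the paper's axiomatic decomposition, and the helper functions are our own.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, idx):
    """Marginalize a joint distribution {tuple: prob} onto the given indices."""
    out = {}
    for outcome, q in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + q
    return out

# Joint distribution of (X, Y, Z): X, Y uniform bits and Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

H_xyz = entropy(joint)
H_xy  = entropy(marginal(joint, (0, 1)))
H_xz  = entropy(marginal(joint, (0, 2)))
H_yz  = entropy(marginal(joint, (1, 2)))
H_x   = entropy(marginal(joint, (0,)))
H_y   = entropy(marginal(joint, (1,)))
H_z   = entropy(marginal(joint, (2,)))

# Each variable alone tells nothing about Z, but the pair tells everything.
I_xz   = H_x + H_z - H_xz       # I(X;Z)   = 0 bits
I_yz   = H_y + H_z - H_yz       # I(Y;Z)   = 0 bits
I_xy_z = H_xy + H_z - H_xyz     # I(X,Y;Z) = 1 bit: purely synergistic

# Co-information (McGill's interaction information up to sign):
# negative values are a classical signature of synergy.
co_info = H_x + H_y + H_z - H_xy - H_xz - H_yz + H_xyz

print(f"I(X;Z) = {I_xz:.3f}, I(Y;Z) = {I_yz:.3f}, I(X,Y;Z) = {I_xy_z:.3f}")
print(f"co-information I(X;Y;Z) = {co_info:.3f}")  # -1.000 for XOR
```

Here the negative co-information (-1 bit) signals synergy, but co-information conflates redundancy and synergy in general; a decomposition of the kind proposed in this work aims to separate such mixed quantities into well-defined redundant and synergistic terms.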
