dit: a Python package for discrete information theory

dit (available at https://github.com/dit/dit) is a Python package for the study of discrete information theory. Information theory is a mathematical framework for quantifying, compressing, and communicating random variables (Cover and Thomas 2006; MacKay 2003; Yeung 2008). More recently, information theory has been applied within the physical and social sciences to quantify how the different components of a system interact. dit is primarily concerned with this aspect of the theory.
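The typical workflow is to construct a joint distribution over several variables and then query it with the package's information measures. The following is a minimal sketch, assuming a recent dit release in which `dit.Distribution`, `dit.shannon`, and `dit.multivariate` are available. It analyzes the xor distribution, a standard example in which neither input bit alone tells you anything about the output, yet together they determine it completely.

```python
import dit

# Joint distribution of (X, Y, Z) where X and Y are independent uniform
# bits and Z = X xor Y: four equiprobable outcomes.
d = dit.Distribution(['000', '011', '101', '110'], [1/4] * 4)

# Joint entropy of all three variables: 2 bits.
print(dit.shannon.entropy(d))

# Either input alone is independent of the output...
print(dit.shannon.mutual_information(d, [0], [2]))  # 0.0 bits

# ...yet the two inputs together fully determine the output.
print(dit.shannon.mutual_information(d, [0, 1], [2]))  # 1.0 bits

# The co-information summarizes this three-way interaction.
print(dit.multivariate.coinformation(d))  # -1.0 bits
```

The negative co-information here reflects synergy: the inputs carry information about the output only jointly, which is exactly the kind of multivariate interaction dit is designed to quantify.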

References

[1] Christof Koch, et al. Quantifying synergistic mutual information, 2012, arXiv.

[2] Renato Renner, et al. A new measure for conditional mutual information and its properties, 2003, Proceedings of the IEEE International Symposium on Information Theory.

[3] Randall D. Beer, et al. Generalized Measures of Information Transfer, 2011, arXiv.

[4] Eckehard Olbrich, et al. Quantifying unique information, 2013, Entropy.

[5] Randall D. Beer, et al. Nonnegative Decomposition of Multivariate Information, 2010, arXiv.

[6] Aaron D. Wyner, et al. The common information of two dependent random variables, 1975, IEEE Transactions on Information Theory.

[7] Ueli Maurer, et al. The intrinsic conditional mutual information and perfect secrecy, 1997, Proceedings of the IEEE International Symposium on Information Theory.

[8] Yaneer Bar-Yam, et al. Multiscale Complexity/Entropy, 2004, Advances in Complex Systems.

[9] Haim H. Permuter, et al. Coordination Capacity, 2009, IEEE Transactions on Information Theory.

[10] Tsachy Weissman, et al. The Information Lost in Erasures, 2008, IEEE Transactions on Information Theory.

[11] Marian Verhelst, et al. Understanding Interdependency Through Complex Information Sharing, 2015, Entropy.

[12] Prakash Narayan, et al. When Is a Function Securely Computable?, 2011, IEEE Transactions on Information Theory.

[13] Mark D. Plumbley, et al. A measure of statistical complexity based on predictive information, 2010, arXiv.

[14] Yunmei Chen, et al. Cumulative residual entropy: a new measure of information, 2004, IEEE Transactions on Information Theory.

[15] Naftali Tishby, et al. The information bottleneck method, 2000, arXiv.

[16] James P. Crutchfield, et al. Intersection Information Based on Common Randomness, 2013, Entropy.

[17] Michael Satosi Watanabe, et al. Information Theoretical Analysis of Multivariate Correlation, 1960, IBM Journal of Research and Development.

[18] Renato Renner, et al. A property of the intrinsic mutual information, 2003, Proceedings of the IEEE International Symposium on Information Theory.

[19] Venkat Anantharam, et al. Information-Theoretic Key Agreement of Multiple Terminals—Part I, 2010, IEEE Transactions on Information Theory.

[20] A. J. Bell. The co-information lattice, 2003.

[21] Michael J. Berry, et al. Network information and connected correlations, 2003, Physical Review Letters.

[22] Daniel Pérez Palomar, et al. Lautum Information, 2008, IEEE Transactions on Information Theory.

[23] Te Sun Han, et al. Linear Dependence Structure of the Entropy Space, 1975, Information and Control.

[24] Daniel Chicharro, et al. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables, 2017, Entropy.

[25] Abbas El Gamal, et al. Exact common information, 2014, IEEE International Symposium on Information Theory.

[26] Yaneer Bar-Yam, et al. An Information-Theoretic Formalism for Multiscale Structure in Complex Systems, 2014, arXiv:1409.4708.

[27] Klaus Krippendorff, et al. Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today, 2009, International Journal of General Systems.

[28] Frank Lad, et al. Extropy: a complementary dual of entropy, 2011, arXiv.

[29] Salman Beigi, et al. Φ-Entropic Measures of Correlation, 2016, IEEE Transactions on Information Theory.

[30] Chung Chan, et al. Multivariate Mutual Information Inspired by Secret-Key Agreement, 2015, Proceedings of the IEEE.

[31] Christoph Salge, et al. A Bivariate Measure of Redundant Information, 2012, Physical Review E.

[32] Himanshu Tyagi, et al. When Is a Function Securely Computable?, 2010, IEEE Transactions on Information Theory.

[33] Praveen Kumar, et al. Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables, 2017.

[34] Yaneer Bar-Yam, et al. Multiscale Information Theory and the Marginal Utility of Information, 2017, Entropy.

[35] Eckehard Olbrich, et al. On extractable shared information, 2017, Entropy.

[36] David J. C. MacKay. Information Theory, Inference, and Learning Algorithms, 2003, Cambridge University Press.

[37] Robin A. A. Ince. Measuring multivariate redundant information with pointwise common change in surprisal, 2016, Entropy.

[38] Francisco J. Valverde-Albacete, et al. The Multivariate Entropy Triangle and Applications, 2016, HAIS.

[39] Eckehard Olbrich, et al. Information Decomposition and Synergy, 2015, Entropy.

[40] Eckehard Olbrich, et al. Reconsidering unique information: Towards a multivariate information decomposition, 2014, IEEE International Symposium on Information Theory.

[41] Eckehard Olbrich, et al. Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems, 2012, arXiv.

[42] Te Sun Han, et al. Multiple Mutual Informations and Multiple Interactions in Frequency Data, 1980, Information and Control.

[43] Shun-ichi Amari, et al. Information geometry on hierarchy of probability distributions, 2001, IEEE Transactions on Information Theory.

[44] Wei Liu, et al. The common information of N dependent random variables, 2010, 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[45] Raymond W. Yeung. Information Theory and Network Coding, 2008, Springer.

[46] William J. McGill. Multivariate information transmission, 1954, Transactions of the IRE Professional Group on Information Theory.

[47] Joseph T. Lizier, et al. Towards a synergy-based approach to measuring information modification, 2013, IEEE Symposium on Artificial Life (ALife).

[48] G. Edelman, et al. A measure for brain complexity: relating functional segregation and integration in the nervous system, 1994, Proceedings of the National Academy of Sciences of the United States of America.

[49] Seth Frey, et al. Information encryption in the expert management of strategic uncertainty, 2016, arXiv.

[50] Cristian S. Calude. Information and Randomness: An Algorithmic Perspective, 1994.

[51] Osvaldo A. Rosso, et al. Intensive entropic non-triviality measure, 2004.

[52] Thomas M. Cover, et al. Elements of Information Theory, 2006, Wiley.