A measure of statistical complexity based on predictive information

We introduce an information-theoretic measure of statistical structure, called the 'binding information', for sets of random variables, and compare it with several previously proposed measures, including excess entropy, Bialek et al.'s predictive information, and the multi-information. We derive some properties of the binding information, particularly in relation to the multi-information, and show that, for finite sets of binary random variables, the processes that maximise the binding information are the 'parity' processes. Finally, we discuss some implications of this for the use of the binding information as a measure of complexity.
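The parity claim can be checked numerically. The sketch below assumes the binding information is the dual total correlation, B(X) = H(X) − Σ_i H(X_i | X_{others}) (the total entropy minus the erasure entropy); the function names are illustrative, not from the paper. For the even-parity process on n binary variables (uniform over strings whose bits XOR to 0), every variable is fully determined by the others, so each conditional entropy vanishes and B equals H(X) = n − 1 bits.

```python
import itertools
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginalize(p, keep):
    """Marginal distribution over the coordinate index set `keep`."""
    out = {}
    for x, q in p.items():
        key = tuple(x[i] for i in keep)
        out[key] = out.get(key, 0.0) + q
    return out

def binding_information(p, n):
    """B(X) = H(X) - sum_i H(X_i | X_{others}), the dual total correlation."""
    H = entropy(p)
    residual = 0.0  # erasure entropy: sum of H(X_i | all other variables)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        # chain rule: H(X_i | X_others) = H(X) - H(X_others)
        residual += H - entropy(marginalize(p, others))
    return H - residual

# Even-parity process on 3 bits: uniform over strings with XOR = 0.
n = 3
support = [x for x in itertools.product([0, 1], repeat=n) if sum(x) % 2 == 0]
parity = {x: 1.0 / len(support) for x in support}
print(binding_information(parity, n))  # 2.0 bits, i.e. n - 1
```

For comparison, three independent fair bits give B = 0, since each conditional entropy H(X_i | X_{others}) then equals the full marginal entropy H(X_i).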
