Information of interactions in complex systems

This paper addresses misconceptions surrounding the multivariate interaction-information measure Q, which several authors have reinvented since McGill (1954) proposed it, giving it a variety of names and interpretations. McGill claimed that the measure quantifies the amount of information in interactions among three or more variables in complex systems. In Krippendorff (1980), I raised doubts about the validity of Q and its relatives. The chief problem is that Q-measures fail to recognize that complex interactions tend to involve circularities, and the probability distributions characterizing such circularities cannot be obtained as products of probabilities, on which information theory, as far as Shannon developed it (Shannon & Weaver, 1949), rests. I argued that Q-measures are mere arithmetic artifacts, and proposed an algorithmic solution, now widely accepted, to measuring the amount of information in the interactions within complex systems. The paper responds to Leydesdorff's (2009) "Interaction information: Linear and nonlinear interpretations," published in the current issue of this journal, and to preceding discussions of these issues on the Cybernetics Discussion Group CYBCOM and in personal correspondence involving Jakulin (2009). It relies on demonstrations with numerical data rather than on abstract interpretations of mathematical forms, which can so easily entrap scholars into believing that they measure something real without considering evidence to the contrary.
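For reference, the three-variable form of McGill's measure, in standard textbook notation (a sketch only; the sign convention varies by author, which is itself part of the interpretive confusion at issue), is

\[
Q(X;Y;Z) \;=\; I(X;Y) - I(X;Y \mid Z) \;=\; H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z),
\]

where H denotes Shannon entropy and I mutual information. A standard numerical illustration of the circularity problem: for independent fair bits X and Y with Z = X xor Y, I(X;Y) = 0 but I(X;Y|Z) = 1 bit, so Q = -1 bit under this convention, even though no pair of the three variables carries any information about the third.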

[1] R. W. Yeung, Information Theory and Network Coding, 2008.

[2] J. G. Daugman, Information Theory and Coding, 1998.

[3] W. J. McGill, Multivariate information transmission, Transactions of the IRE Professional Group on Information Theory, 1954.

[4] K. Krippendorff, Information Theory: Structural Models for Qualitative Data, 1986.

[5] H. Matsuda, Physical nature of higher-order mutual information: intrinsic correlations and frustration, Physical Review E, 2000.

[6] W. R. Garner and W. J. McGill, The relation between information and variance analyses, Psychometrika, 1956.

[7] D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.

[8] S. Watanabe, Information theoretical analysis of multivariate correlation, IBM Journal of Research and Development, 1960.

[9] H. Etzkowitz, The Triple Helix of University-Industry-Government, 2002.

[10] L. Leydesdorff, The Triple Helix of university-industry-government relations, Scientometrics, 2003.

[11] A. J. Bell, The co-information lattice, 2003.

[12] C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 1948.

[13] W. R. Garner, Uncertainty and Structure as Psychological Concepts, 1962.

[14] W. R. Ashby, An Introduction to Cybernetics, 1956.

[15] L. Leydesdorff, Interaction information: linear and nonlinear interpretations, International Journal of General Systems, 2009.

[16] D. Lucio-Arias et al., The dynamics of exchanges and references among scientific texts, and the autopoiesis of discursive knowledge, Journal of Informetrics, 2009.

[17] K. Krippendorff, Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today, International Journal of General Systems, 2009.

[18] H. Quastler (Ed.), Information Theory in Psychology: Problems and Methods, 1955.

[19] H. Theil, Statistical Decomposition Analysis, 1972.