Modes of Information Flow

Author(s): James, Ryan G.; Ayala, Blanca Daniella Mansante; Zakirov, Bahti; Crutchfield, James P.

Abstract: Information flow between components of a system takes many forms and is key to understanding the organization and functioning of large-scale, complex systems. We demonstrate three modalities of information flow from time series X to time series Y. Intrinsic information flow exists when the past of X is individually predictive of the present of Y, independent of Y's past; this is the mode most commonly considered to be information flow. Shared information flow exists when X's past is predictive of Y's present in the same manner as Y's own past; this occurs due to synchronization or common driving, for example. Finally, synergistic information flow occurs when neither X's past nor Y's past is predictive of Y's present on its own, but taken together they are. The two most broadly employed information-theoretic methods of quantifying information flow---time-delayed mutual information and transfer entropy---are each sensitive to a pair of these modalities: time-delayed mutual information to both intrinsic and shared flow, and transfer entropy to both intrinsic and synergistic flow. To quantify each mode individually we introduce our cryptographic flow ansatz, positing that intrinsic flow is synonymous with secret key agreement between X and Y. Based on this, we employ an easily-computed secret-key-agreement bound---intrinsic mutual information---to quantify the three flow modalities in a variety of systems, including asymmetric flows and financial markets.

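Since the abstract contrasts time-delayed mutual information and transfer entropy, a short numerical illustration may help make the distinction concrete. The following Python sketch is not taken from the paper: the plug-in entropy estimator and the XOR example system are illustrative assumptions. Transfer entropy is computed via the standard conditional-mutual-information identity I[X_{t-1} : Y_t | Y_{t-1}] = H[X_{t-1}, Y_{t-1}] + H[Y_{t-1}, Y_t] - H[Y_{t-1}] - H[X_{t-1}, Y_{t-1}, Y_t].

    import numpy as np
    from collections import Counter

    def entropy(symbols):
        """Plug-in Shannon entropy (in bits) of a sequence of hashable symbols."""
        counts = np.array(list(Counter(symbols).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def time_delayed_mi(x, y, lag=1):
        """I[X_{t-lag} : Y_t]: sensitive to intrinsic and shared flow."""
        x_past, y_now = x[:-lag], y[lag:]
        return (entropy(x_past) + entropy(y_now)
                - entropy(list(zip(x_past, y_now))))

    def transfer_entropy(x, y, lag=1):
        """I[X_{t-lag} : Y_t | Y_{t-lag}]: sensitive to intrinsic and synergistic flow."""
        x_past, y_past, y_now = x[:-lag], y[:-lag], y[lag:]
        return (entropy(list(zip(x_past, y_past)))
                + entropy(list(zip(y_past, y_now)))
                - entropy(y_past)
                - entropy(list(zip(x_past, y_past, y_now))))

    # Purely synergistic flow: Y_t = X_{t-1} XOR Y_{t-1}, with X i.i.d. uniform bits.
    # Neither past alone predicts Y_t, but the two pasts together determine it.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=100_000)
    y = np.zeros_like(x)
    for t in range(1, len(x)):
        y[t] = x[t - 1] ^ y[t - 1]
    print(time_delayed_mi(x, y))   # ~0 bits: no intrinsic or shared flow detected
    print(transfer_entropy(x, y))  # ~1 bit: synergistic flow detected

For this XOR system the time-delayed mutual information estimate is near 0 bits while the transfer entropy is near 1 bit. Conversely, a synchronized pair (say X_t = Y_t = Z_t with Z a persistent Markov chain) would show positive time-delayed mutual information from shared flow while the transfer entropy vanishes, illustrating why each measure conflates a different pair of modalities.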