Unique Information and Secret Key Agreement

The partial information decomposition (PID) is a promising framework for decomposing a joint random variable into the amount of influence each source variable Xi has on a target variable Y, relative to the other sources. For two sources, this influence breaks down into the information that both X0 and X1 redundantly share with Y, the information X0 uniquely shares with Y, the information X1 uniquely shares with Y, and finally the information X0 and X1 synergistically share with Y. Unfortunately, considerable disagreement has arisen as to how these four components should be quantified. Drawing from cryptography, we consider the secret key agreement rate as an operational method of quantifying unique information. The secret key agreement rate comes in several forms, depending upon which parties are permitted to communicate. We demonstrate that three of these four forms are inconsistent with the PID. The remaining form implies certain interpretations as to the PID's meaning—interpretations not present in the PID's definition but that, we argue, need to be made explicit. Specifically, quantifying a consistent PID using a secret key agreement rate naturally induces a directional interpretation of the PID. We further reveal a surprising connection between third-order connected information, the two-way secret key agreement rate, and synergy. We also consider difficulties that arise with a popular PID measure in light of these results as well as from a maximum entropy viewpoint. We close by reviewing the challenges facing the PID.