Tight Bounds for Communication-Assisted Agreement Distillation

Suppose Alice holds a uniformly random string X ∈ {0, 1}^N and Bob holds a noisy version Y of X in which each bit of X is flipped independently with probability ε ∈ [0, 1/2]. Alice and Bob would like to extract a common random string of min-entropy at least k. In this work, we establish the communication versus success probability trade-off for this problem by giving a protocol and a matching lower bound (under the restriction that the string to be agreed upon is determined by Alice's input X). Specifically, we prove that in order for Alice and Bob to agree on a common string with probability 2^{-γk} (γk ≥ 1), the optimal communication (up to o(k) terms, and achievable for large N) is precisely (√(C(1 + γ)) − √γ)² · k, where C := 4ε(1 − ε). In particular, the optimal communication to achieve Ω(1) agreement probability approaches 4ε(1 − ε)k. We also consider the case where Y is the output of the binary erasure channel applied to X, in which each bit of Y equals the corresponding bit of X with probability 1 − ε and is otherwise erased (that is, replaced by a '?'). In this case, the communication required becomes (√(ε(1 + γ)) − √γ)² · k. In particular, the optimal communication to achieve Ω(1) agreement probability approaches εk, and with no communication the optimal agreement probability approaches 2^{-(ε/(1-ε))k}. Our protocols are based on covering codes and extend the approach of (Bogdanov and Mossel, 2011) for the zero-communication case. Our lower bounds rely on hypercontractive inequalities. For the bit-flip model, our argument extends the approach of (Bogdanov and Mossel, 2011) by allowing communication; for the erasure model, to the best of our knowledge the needed hypercontractivity statement had not been studied before, and, prompted by our application, it was established by (Nair and Wang, 2015). We also obtain information complexity lower bounds for these tasks, and together with our protocol, they shed light on the recently popular "most informative Boolean function" conjecture of Courtade and Kumar.
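As a quick numerical sanity check of the trade-off expressions above, the following Python sketch evaluates the communication rate per entropy bit for both channel models. This is only an illustration of the formulas, not the paper's covering-code protocol; the helper names comm_bsc and comm_bec are ours. It confirms that as γ → 0 the rates approach 4ε(1 − ε) and ε respectively, and that they vanish exactly at the zero-communication exponents γ = C/(1 − C) and γ = ε/(1 − ε).

```python
# Minimal sanity check of the stated trade-offs (illustration only,
# not the authors' protocol). comm_bsc/comm_bec are hypothetical names.
import math

def comm_bsc(eps: float, gamma: float) -> float:
    """Communication per entropy bit, bit-flip model: (sqrt(C(1+g)) - sqrt(g))^2."""
    C = 4 * eps * (1 - eps)
    return (math.sqrt(C * (1 + gamma)) - math.sqrt(gamma)) ** 2

def comm_bec(eps: float, gamma: float) -> float:
    """Communication per entropy bit, erasure model: (sqrt(e(1+g)) - sqrt(g))^2."""
    return (math.sqrt(eps * (1 + gamma)) - math.sqrt(gamma)) ** 2

eps = 0.1
C = 4 * eps * (1 - eps)

# As gamma -> 0 (i.e., Omega(1) agreement probability), the rates
# approach C = 4*eps*(1-eps) and eps respectively.
print(comm_bsc(eps, 1e-9), C)    # both ~0.36
print(comm_bec(eps, 1e-9), eps)  # both ~0.1

# The rate hits zero exactly at the zero-communication exponents.
print(comm_bsc(eps, C / (1 - C)))      # ~0.0
print(comm_bec(eps, eps / (1 - eps)))  # ~0.0
```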

[1] Elza Erkip, et al. The Efficiency of Investment Information, 1998, IEEE Trans. Inf. Theory.

[2] William Feller, et al. An Introduction to Probability Theory and Its Applications, 1951.

[3] Konstantin Makarychev, et al. Chain Independence and Common Information, 2012, IEEE Transactions on Information Theory.

[4] Venkat Anantharam, et al. On Maximal Correlation, Hypercontractivity, and the Data Processing Inequality studied by Erkip and Cover, 2013, ArXiv.

[5] Alex Samorodnitsky. The "Most informative Boolean function" conjecture holds for high noise, 2015, ArXiv.

[6] Elchanan Mossel, et al. Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality, 2004, math/0410560.

[7] Aaron D. Wyner, et al. A theorem on the entropy of certain binary sequences and applications-II, 1973, IEEE Trans. Inf. Theory.

[8] Lei Zhao, et al. The efficiency of common randomness generation, 2011, 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[9] Alex Samorodnitsky, et al. On the Entropy of a Noisy Function, 2015, IEEE Transactions on Information Theory.

[10] V. Chandar, et al. Most Informative Quantization Functions, 2014.

[11] Martin Bossert, et al. Canalizing Boolean Functions Maximize Mutual Information, 2012, IEEE Transactions on Information Theory.

[12] Ryan O'Donnell, et al. Remarks on the Most Informative Function Conjecture at fixed mean, 2015.

[13] Thomas A. Courtade, et al. Which Boolean Functions Maximize Mutual Information on Noisy Inputs?, 2014, IEEE Transactions on Information Theory.

[14] Elchanan Mossel, et al. On Extracting Common Random Bits From Correlated Sources, 2011, IEEE Transactions on Information Theory.

[15] Ke Yang, et al. On the (im)possibility of non-interactive correlation distillation, 2004, Theor. Comput. Sci.

[16] Aaron D. Wyner, et al. A theorem on the entropy of certain binary sequences and applications-I, 1973, IEEE Trans. Inf. Theory.

[17] Ryan O'Donnell, et al. Coin flipping from a cosmic source: On error correction of truly random bits, 2004, Random Struct. Algorithms.

[18] Chandra Nair, et al. Evaluating hypercontractivity parameters using information measures, 2016, IEEE International Symposium on Information Theory (ISIT).

[19] Rudolf Ahlswede, et al. Common Randomness in Information Theory and Cryptography - Part II: CR Capacity, 1998, IEEE Trans. Inf. Theory.

[20] Ryan O'Donnell. Analysis of Boolean Functions, 2014, ArXiv.

[21] Venkat Anantharam, et al. On hypercontractivity and a data processing inequality, 2014, IEEE International Symposium on Information Theory.

[22] Venkat Anantharam, et al. On hypercontractivity and the mutual information between Boolean functions, 2013, 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[23] Omri Weinstein, et al. Dictatorship is the Most Informative Balanced Function at the Extremes, 2015, Electron. Colloquium Comput. Complex.