The Communication Complexity of Correlation

Let <i>X</i> and <i>Y</i> be finite nonempty sets and (<i>X</i>, <i>Y</i>) a pair of random variables taking values in <i>X</i> × <i>Y</i>. We consider communication protocols between two parties, <b>Alice</b> and <b>Bob</b>, for generating <i>X</i> and <i>Y</i>. <b>Alice</b> is provided an <i>x</i> ∈ <i>X</i> generated according to the distribution of <i>X</i>, and is required to send a message to <b>Bob</b> in order to enable him to generate <i>y</i> ∈ <i>Y</i> whose distribution is the same as that of <i>Y</i>|<i>X</i>=<i>x</i>. Both parties have access to a shared random string generated in advance. Let <i>T</i>[<i>X</i>:<i>Y</i>] be the minimum (over all protocols) of the expected number of bits <b>Alice</b> needs to transmit to achieve this. We show that <i>I</i>[<i>X</i>:<i>Y</i>] ≤ <i>T</i>[<i>X</i>:<i>Y</i>] ≤ <i>I</i>[<i>X</i>:<i>Y</i>] + 2 log<sub>2</sub>(<i>I</i>[<i>X</i>:<i>Y</i>] + 1) + <i>O</i>(1). We also consider the worst-case communication required for this problem, where we seek to minimize the average number of bits <b>Alice</b> must transmit for the worst-case <i>x</i> ∈ <i>X</i>. We show that the communication required in this case is related to the capacity <i>C</i>(<i>E</i>) of the channel <i>E</i>, derived from (<i>X</i>, <i>Y</i>), that maps <i>x</i> ∈ <i>X</i> to the distribution of <i>Y</i>|<i>X</i>=<i>x</i>: the required communication <i>T</i>(<i>E</i>) satisfies <i>C</i>(<i>E</i>) ≤ <i>T</i>(<i>E</i>) ≤ <i>C</i>(<i>E</i>) + 2 log<sub>2</sub>(<i>C</i>(<i>E</i>) + 1) + <i>O</i>(1). Using the first result, we derive a direct-sum theorem in communication complexity that substantially improves the previous such result shown by Jain, Radhakrishnan, and Sen [in Proc. 30th International Colloquium on Automata, Languages and Programming (ICALP), ser. Lecture Notes in Computer Science, vol. 2719, 2003, pp. 300-315]. These results are obtained by employing a rejection sampling procedure that relates the relative entropy between two distributions to the communication complexity of generating one distribution from the other.
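To make the rejection-sampling idea concrete, here is a minimal Python sketch (using numpy) of a greedy rejection sampler over a shared stream of samples: Alice, who knows the target distribution Q (the law of Y|X=x), scans samples drawn i.i.d. from a proposal P (for instance the marginal of Y) using shared randomness, and transmits only the index of the sample she accepts; Bob recovers the same sample from the shared stream. The function name, the particular acceptance rule, and the verification code are illustrative assumptions rather than the paper's exact construction; the paper's analysis shows that the chosen index can be encoded with roughly D(Q‖P) + 2 log<sub>2</sub>(D(Q‖P)+1) + O(1) bits in expectation.

```python
import numpy as np


def greedy_rejection_sample(p, q, rng):
    """Pick an index into a stream of shared samples z_1, z_2, ... ~ p
    so that the accepted sample is distributed exactly according to q.

    p, q : 1-D arrays of probabilities over the same finite alphabet
           (assumes every symbol with q > 0 also has p > 0).
    Returns (index, symbol): Alice transmits the index; Bob, who sees
    the same shared random stream, recovers the symbol without knowing q.
    """
    covered = np.zeros_like(q)            # mass of q already accounted for
    i = 0
    while True:
        i += 1
        residual = 1.0 - covered.sum()    # probability of reaching round i
        z = rng.choice(len(p), p=p)       # i-th shared sample, z_i ~ p
        # Portion of q still needed, capped by the mass available this round.
        alpha = np.minimum(q - covered, residual * p)
        # Accept z_i with probability alpha[z] / (residual * p[z]).
        if rng.random() * residual * p[z] < alpha[z]:
            return i, z
        covered += alpha                  # book-keeping for the next round


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = np.array([0.5, 0.3, 0.2])         # proposal, e.g. the marginal of Y
    q = np.array([0.1, 0.1, 0.8])         # target, e.g. the law of Y | X = x

    draws = [greedy_rejection_sample(p, q, rng) for _ in range(20000)]
    indices = np.array([i for i, _ in draws])
    symbols = np.array([z for _, z in draws])

    empirical = np.bincount(symbols, minlength=len(q)) / len(symbols)
    divergence = np.sum(q * np.log2(q / p))          # D(q || p) in bits
    print("empirical output distribution:", np.round(empirical, 3))
    print("target q:                     ", q)
    print("mean log2(index):", np.mean(np.log2(indices)).round(3),
          " D(q||p):", divergence.round(3))
```

Encoding the accepted index with a prefix-free code for the integers (about log<sub>2</sub> i + 2 log<sub>2</sub> log<sub>2</sub> i bits) then gives an expected message length of the form stated above, with D(Q‖P) in place of I[X:Y] for a fixed input x; since the expectation of D over x ~ X equals I[X:Y] when P is the marginal of Y, averaging over inputs yields the mutual-information bound.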

[1] L. H. Harper, Optimal numberings and isoperimetric problems on graphs, 1966.

[2] D. A. Bell et al., Information Theory and Reliable Communication, 1969.

[3] Aaron D. Wyner et al., The common information of two dependent random variables, 1975, IEEE Trans. Inf. Theory.

[4] Andrew Chi-Chih Yao et al., Probabilistic computations: Toward a unified measure of complexity, 1977, 18th Annual Symposium on Foundations of Computer Science (SFCS 1977).

[5] B. Bollobás, Combinatorics: Set Systems, Hypergraphs, Families of Vectors and Combinatorial Probability, 1986.

[6] V. Rich, Personal communication, 1989, Nature.

[7] Thomas M. Cover et al., Elements of Information Theory, 2005.

[8] Noga Alon et al., The Probabilistic Method, 2015, Fundamentals of Ramsey Theory.

[9] Ming Li et al., An Introduction to Kolmogorov Complexity and Its Applications, 2019, Texts in Computer Science.

[10] Eyal Kushilevitz et al., Communication Complexity, 1997, Adv. Comput.

[11] Andrew Chi-Chih Yao et al., Informational complexity and the direct sum problem for simultaneous message complexity, 2001, Proceedings 2001 IEEE International Conference on Cluster Computing.

[12] Jaikumar Radhakrishnan et al., Privacy and interaction in quantum communication complexity and a theorem about the relative entropy of quantum states, 2002, The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002, Proceedings.

[13] Ziv Bar-Yossef et al., An information statistics approach to data stream and communication complexity, 2002, The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002, Proceedings.

[14] A. Winter, Compression of sources of probability distributions and density operators, 2002, quant-ph/0208131.

[15] Peter W. Shor et al., Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem, 2001, IEEE Trans. Inf. Theory.

[16] Jaikumar Radhakrishnan et al., A Direct Sum Theorem in Communication Complexity via Message Compression, 2003, ICALP.

[17] Jaikumar Radhakrishnan et al., A lower bound for the bounded round quantum communication complexity of set disjointness, 2003, 44th Annual IEEE Symposium on Foundations of Computer Science, 2003, Proceedings.

[18] Ravi Kumar et al., An information statistics approach to data stream and communication complexity, 2004, J. Comput. Syst. Sci.

[19] Noga Alon et al., The Probabilistic Method, Second Edition, 2004.

[20] Amit Chakrabarti et al., An optimal randomised cell probe lower bound for approximate nearest neighbour searching, 2004, 45th Annual IEEE Symposium on Foundations of Computer Science.

[21] Jaikumar Radhakrishnan et al., Prior entanglement, message compression and privacy in quantum communication, 2005, 20th Annual IEEE Conference on Computational Complexity (CCC'05).

[22] Rahul Jain et al., Communication complexity of remote state preparation with entanglement, 2005, Quantum Inf. Comput.

[23] Paul W. Cuff et al., Communication requirements for generating correlated random variables, 2008, 2008 IEEE International Symposium on Information Theory.

[24] Paul M. B. Vitányi et al., An Introduction to Kolmogorov Complexity and Its Applications, Third Edition, 1997, Texts in Computer Science.

[25] Jaikumar Radhakrishnan et al., A property of quantum relative entropy with an application to privacy in quantum communication, 2009, JACM.

[26] Jaikumar Radhakrishnan et al., The communication complexity of correlation, 2010, IEEE Trans. Inf. Theory.