Interactive Communication of Balanced Distributions and of Correlated Files

$(X,Y)$ is a pair of random variables distributed over a support set $S$. Person $P_X$ knows $X$, person $P_Y$ knows $Y$, and both know $S$. Using a predetermined protocol, they exchange binary messages for $P_Y$ to learn $X$; $P_X$ may or may not learn $Y$. The $m$-message complexity $\hat C_m$ is the number of information bits that must be transmitted (by both persons) in the worst case if only $m$ messages are allowed. $\hat C_\infty$ is the number of bits required when there is no restriction on the number of messages exchanged. A natural class of random pairs is considered. $\hat \mu$ is the maximum number of $X$ values possible with a given $Y$ value; $\hat \eta$ is the maximum number of $Y$ values possible with a given $X$ value. The random pair $(X,Y)$ is balanced if $\hat \mu = \hat \eta$. The following hold for all balanced random pairs. One-way communication requires at most twice the minimum number of bits: $\hat C_1 \leq 2\hat C_\infty + 1$. This bound is almost tight: for every $\alpha$, there is a ba...
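
To make the quantities $\hat\mu$, $\hat\eta$, $\hat C_1$, and $\hat C_\infty$ concrete, the following is a minimal worked example; the particular support set and the Hamming-code protocol are assumptions chosen for illustration, not taken from the abstract. Let $X$ and $Y$ be $n$-bit strings that differ in at most one position:
\[
  S \;=\; \bigl\{(x,y) \in \{0,1\}^n \times \{0,1\}^n : d_H(x,y) \le 1\bigr\}.
\]
For any fixed $y$ there are $n+1$ consistent values of $x$ (namely $y$ itself and its $n$ single-bit flips), and symmetrically for any fixed $x$, so
\[
  \hat\mu \;=\; \hat\eta \;=\; n+1
\]
and the pair is balanced. Since $P_Y$ must distinguish among $\hat\mu$ candidate values of $X$ in the worst case,
\[
  \hat C_\infty \;\ge\; \lceil \log_2 \hat\mu \rceil \;=\; \lceil \log_2 (n+1) \rceil.
\]
When $n = 2^k - 1$, a single message meets this bound: $P_X$ sends the $k$-bit syndrome $Hx$ of $X$ under a perfect Hamming code with parity-check matrix $H$. $P_Y$ computes $Hx + Hy = H(x+y)$; because $x+y$ has weight at most one and the columns of $H$ are distinct and nonzero, this identifies the differing position (or that $x = y$), so $P_Y$ recovers $X$. Hence $\hat C_1 = \hat C_\infty = \log_2(n+1)$ for this pair.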