Let X and Y be two jointly distributed random variables. Suppose person P_X, the informant, knows X, and person P_Y, the recipient, knows Y, and both know the joint probability distribution of the pair (X, Y). Using a predetermined protocol, they communicate over a binary error-free channel so that P_Y learns X, whereas P_X may or may not learn Y. Ĉ_m(X|Y) is the minimum number of bits that must be transmitted (by both persons combined) in the worst case when only m message exchanges are allowed, and Ĉ_∞(X|Y) is the corresponding number when P_X and P_Y can communicate back and forth an arbitrary number of times. Orlitsky proved that Ĉ_2(X|Y) ≤ 4Ĉ_∞(X|Y) + 3 for all (X, Y) pairs, and that for every positive c and every ε > 0 there exist (X, Y) pairs for which Ĉ_3(X|Y) ≥ (2 − ε)Ĉ_4(X|Y) ≥ c. That is, three messages are not optimal either.
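The savings that interaction buys can be illustrated with a toy instance of the kind used in this literature. The scenario, names, and bit accounting below are illustrative assumptions, not a protocol from the papers cited here: P_Y holds an unordered pair of distinct n-bit strings (say, two team IDs), and X is one of the two (the winner), known to P_X. Sending X outright costs n bits, but a two-message exchange costs only ⌈log₂ n⌉ + 1 bits:

```python
import math

def two_message_protocol(x, pair):
    """Toy two-message protocol (illustrative sketch).

    P_Y knows an unordered pair of distinct n-bit strings; P_X knows
    which of the two is X.  Instead of P_X sending all n bits of X,
    the parties exchange two short messages.
    """
    a, b = pair
    n = len(a)
    # Message 1 (P_Y -> P_X): an index where the two strings differ,
    # costing ceil(log2 n) bits.
    i = next(k for k in range(n) if a[k] != b[k])
    # Message 2 (P_X -> P_Y): X's bit at that index, costing 1 bit.
    bit = x[i]
    # P_Y now recovers X locally: only one of the pair matches at i.
    recovered = a if a[i] == bit else b
    bits_used = math.ceil(math.log2(n)) + 1
    return recovered, bits_used

# Example: 4-bit IDs, so the exchange costs ceil(log2 4) + 1 = 3 bits
# instead of the 4 bits needed to transmit X directly.
result, cost = two_message_protocol("1011", ("1011", "1001"))
```

Note that the first message must come from the recipient: only P_Y knows which two strings are in play, and that knowledge is what lets a single bit from P_X resolve the ambiguity.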
[1] Christos H. Papadimitriou, et al., "Communication complexity," STOC '82, 1982.
[2] Alon Orlitsky, et al., "Worst-case interactive communication I: Two messages are almost optimal," IEEE Trans. Inf. Theory, 1990.
[3] Alon Orlitsky, et al., "Interactive Data Comparison," FOCS, 1984.
[4] Zvi Galil, et al., "Lower bounds on communication complexity," STOC '84, 1984.
[5] Alon Orlitsky, et al., "Worst-case interactive communication II: Two messages are not optimal," IEEE Trans. Inf. Theory, 1991.