Interactive Channel Capacity Revisited

We provide the first capacity-approaching coding schemes that robustly simulate any interactive protocol over an adversarial channel that corrupts any ε fraction of the transmitted symbols. Our coding schemes achieve a communication rate of 1 - O(√(ε log log 1/ε)) over any adversarial channel. This can be improved to 1 - O(√ε) for random, oblivious, and computationally bounded channels, or if the parties have shared randomness unknown to the channel. Surprisingly, these rates exceed the 1 - Ω(√(H(ε))) = 1 - Ω(√(ε log 1/ε)) interactive channel capacity bound which [Kol and Raz; STOC'13] recently proved for random errors. We conjecture 1 - Θ(√(ε log log 1/ε)) and 1 - Θ(√ε) to be the optimal rates for their respective settings and therefore to capture the interactive channel capacity for adversarial and random errors, respectively. In addition to being very communication efficient, our randomized coding schemes have multiple other advantages. They are computationally efficient, extremely natural, and significantly simpler than prior (non-capacity-approaching) schemes. In particular, our protocols do not employ any coding but allow the original protocol to be performed as-is, interspersed only by short exchanges of hash values. When hash values do not match, the parties backtrack. Our approach is, we feel, by far the simplest and most natural explanation for why and how robust interactive communication in a noisy environment is possible.
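To make the hash-and-backtrack mechanism concrete, the following is a minimal Python sketch of the idea as described in the abstract. It is an illustration under simplifying assumptions, not the paper's actual coding scheme: the "protocol" is just Alice streaming bits to Bob, the channel flips each bit independently with probability EPS rather than adversarially, the hash exchanges are assumed to arrive intact, and all names and parameters (simulate, short_hash, BLOCK, EPS, HASH_BYTES) are invented for the example.

```python
"""Toy sketch of the hash-and-backtrack idea: run the protocol as-is,
periodically compare short hashes of the transcripts, and rewind on mismatch.
All parameters and the error model are illustrative assumptions."""
import hashlib
import random

EPS = 0.02        # assumed bit-flip probability of the channel (illustrative)
BLOCK = 16        # assumed number of protocol rounds between hash exchanges
HASH_BYTES = 2    # assumed length of the short hash values, in bytes


def noisy(bit: int) -> int:
    """Transmit one bit over a binary symmetric channel with error EPS."""
    return bit ^ (random.random() < EPS)


def short_hash(transcript: list[int]) -> bytes:
    """Short fingerprint of a party's current view of the transcript."""
    return hashlib.blake2b(bytes(transcript), digest_size=HASH_BYTES).digest()


def simulate(n_rounds: int = 256) -> tuple[list[int], list[int]]:
    """Alice streams her protocol bits as-is; Bob records what he receives.

    Every BLOCK rounds the parties compare short hashes of their transcripts
    (assumed here to travel over a reliable side channel, to keep the sketch
    small).  On a mismatch both parties backtrack to the last point at which
    their hashes agreed and replay from there.
    """
    original = [random.randint(0, 1) for _ in range(n_rounds)]  # Alice's messages
    alice: list[int] = []   # Alice's view of the transcript
    bob: list[int] = []     # Bob's view of the transcript
    checkpoint = 0          # length of the last verified common prefix

    while len(alice) < n_rounds:
        # Run the original protocol as-is for one block.
        for _ in range(BLOCK):
            if len(alice) == n_rounds:
                break
            bit = original[len(alice)]
            alice.append(bit)
            bob.append(noisy(bit))

        # Exchange short hash values; backtrack to the checkpoint on mismatch.
        if short_hash(alice) == short_hash(bob):
            checkpoint = len(alice)
        else:
            del alice[checkpoint:]
            del bob[checkpoint:]

    return original, bob


if __name__ == "__main__":
    random.seed(0)
    sent, received = simulate()
    print("agreement:", sent == received)
```

In this toy version the only overhead is the periodic hash exchange and the blocks replayed after a mismatch (plus a tiny probability of an undetected hash collision), which gives some intuition for why the rate loss can shrink as the error rate ε goes to 0; the actual rates quoted above depend on a careful choice of block length, hash size, and backtracking rule that this sketch does not attempt.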

[1] Madhu Sudan et al. Optimal error rates for interactive coding I: adaptivity and other settings. STOC, 2013.

[2] Bernhard Haeupler et al. Optimal Error Rates for Interactive Coding II: Efficiency and List Decoding. FOCS, 2014.

[3] David Peleg. Distributed Computing: A Locality-Sensitive Approach. SIAM, 2000.

[4] Moni Naor et al. Fast Algorithms for Interactive Coding. SODA, 2013.

[5] Mark Braverman. Coding for interactive computation: Progress and challenges. Allerton Conference on Communication, Control, and Computing, 2012.

[6] Rafail Ostrovsky et al. Optimal Coding for Streaming Authentication and Interactive Communication. IEEE Transactions on Information Theory, 2015.

[7] Mark Braverman et al. Toward Coding for Maximum Errors in Interactive Communication. IEEE Transactions on Information Theory, 2011.

[8] Amit Sahai et al. Efficient and Explicit Coding for Interactive Communication. FOCS, 2011.

[9] Ran Raz et al. Interactive channel capacity. STOC, 2013.

[10] Moni Naor et al. Small-bias probability spaces: efficient constructions and applications. STOC, 1990.

[11] Klim Efremenko et al. Maximal Noise in Interactive Communication Over Erasure Channels and Channels With Feedback. IEEE Transactions on Information Theory, 2015.

[12] Leonard J. Schulman. Communication on noisy channels: a coding theorem for computation. FOCS, 1992.

[13] Leonard J. Schulman. Coding for interactive communication. IEEE Transactions on Information Theory, 1996.

[14] Mark Braverman et al. List and Unique Coding for Interactive Communication in the Presence of Adversarial Noise. FOCS, 2014.

[15] Ran Gelles et al. Capacity of Interactive Communication over Erasure Channels and Channels with Feedback. SIAM Journal on Computing, 2015.

[16] Yael Tauman Kalai et al. Efficient Interactive Coding against Adversarial Noise. FOCS, 2012.