List decoding with side information

In list decoding of error-correcting codes, the decoding algorithm is allowed to output a small list of codewords that are close to the noisy received word. This relaxation permits recovery even under very high noise thresholds. We consider one scenario that permits disambiguating between the elements of the list, namely where the sender provides a hopefully small amount of side information about the transmitted message on a separate, noise-free auxiliary channel. This setting becomes meaningful and useful when the amount of side information that needs to be communicated is much smaller than the length of the message. We study what kind of side information is necessary and sufficient in this context. The short, conceptual answer is that the side information must be randomized, and message recovery then succeeds with a small failure probability. Specifically, we prove that deterministic schemes, which guarantee correct recovery of the message, provide no savings: essentially the entire message has to be sent as side information. However, there exist randomized schemes that need side information of length only logarithmic in the message length. In fact, in the limit of repeated communication of several messages, the amortized amount of side information needed per message can be a constant independent of the message length and the failure probability. Concretely, we can correct up to a fraction (1/2 - γ) of errors for binary codes using only 2 log(1/γ) + O(1) amortized bits of side information per message, and this is in fact the best possible (up to additive constant terms).
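To illustrate the flavor of a randomized side-information scheme (this is a toy sketch, not the paper's construction), the sender can transmit a short random fingerprint of the message over the noise-free channel: a polynomial hash evaluated at a random point. The receiver list-decodes the noisy word and keeps the unique candidate whose fingerprint matches. Two distinct messages of length n collide for at most n evaluation points, so the failure probability is at most n·L/P for a list of size L. All names below (`fingerprint`, `disambiguate`) are illustrative.

```python
import random

# Mersenne prime modulus for the polynomial fingerprint.
P = (1 << 61) - 1

def fingerprint(msg: bytes, r: int) -> int:
    """Polynomial hash of msg evaluated at a random point r.
    Two distinct messages of length n agree on at most n values of r,
    so a random r gives collision probability <= n / P per pair."""
    h = 0
    for b in msg:
        h = (h * r + b) % P
    return h

def disambiguate(candidates, r, tag):
    """Keep the unique list-decoding candidate matching the side info."""
    matches = [c for c in candidates if fingerprint(c, r) == tag]
    return matches[0] if len(matches) == 1 else None

random.seed(0)  # fixed only to make this demo reproducible

# Sender: picks a fresh random evaluation point per message and sends
# (r, tag) -- O(log n) bits in total -- over the noise-free channel.
msg = b"attack at dawn"
r = random.randrange(2, P)
tag = fingerprint(msg, r)

# Receiver: list-decodes the noisy received word into a small candidate
# list, then filters the list by the side information.
candidates = [b"attack at dusk", msg, b"retreat at dawn"]
recovered = disambiguate(candidates, r, tag)
```

Note that a deterministic fingerprint cannot work here, matching the paper's negative result: for any fixed hash that is shorter than the message, two messages collide, and an adversarial channel can place both in the decoded list.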
