Near sufficiency of random coding for two descriptions

We give a single-letter outer bound for the two-description problem for independent and identically distributed (i.i.d.) sources that is universally close to the El Gamal and Cover (EGC) inner bound. The gaps for the sum rate and the individual rates under a quadratic distortion measure are upper-bounded by 1.5 and 0.5 bits/sample, respectively, and these gaps are universal with respect to the source being encoded and the desired distortion levels. Variants of our basic ideas are presented, including upper and lower bounds on the second channel's rate when the first channel's rate is arbitrarily close to the rate-distortion function; these bounds differ, in the limit as the code block length goes to infinity, by no more than 2 bits/sample. An interesting aspect of our methodology is the manner in which the matching single-letter outer bound is obtained: we eschew common techniques for constructing single-letter bounds in favor of new ideas from the field of rate loss bounds. We expect these techniques to be generally applicable to other settings of interest.
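To make the quoted gaps concrete, the following display is a schematic restatement under assumed notation, not taken verbatim from the paper: $R_1, R_2$ denote the two description rates, $R(\cdot)$ the classical rate-distortion function, the superscripts EGC and out mark rate points on the EGC inner bound and on the new outer bound at the same distortion targets, and $R_2^{+}, R_2^{-}$ stand for the upper and lower bounds on the second channel's rate at block length $n$.

\[
R_i^{\mathrm{EGC}} - R_i^{\mathrm{out}} \le 0.5~\text{bits/sample}, \qquad i \in \{1, 2\},
\]
\[
\left(R_1^{\mathrm{EGC}} + R_2^{\mathrm{EGC}}\right) - \left(R_1^{\mathrm{out}} + R_2^{\mathrm{out}}\right) \le 1.5~\text{bits/sample},
\]
\[
\lim_{n \to \infty} \left(R_2^{+} - R_2^{-}\right) \le 2~\text{bits/sample} \quad \text{when } R_1 \to R(D_1).
\]

Here $D_1$ is the distortion target of the first channel, and the constants 0.5, 1.5, and 2 are exactly the universal gaps stated in the abstract.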

[1] Toby Berger et al., "Multiple description source coding with no excess marginal rate," IEEE Trans. Inf. Theory, 1995.

[2] Jack K. Wolf et al., "Noiseless coding of correlated information sources," IEEE Trans. Inf. Theory, 1973.

[3] Cathy H. Xia et al., "Distributed source coding in dense sensor networks," Data Compression Conference, 2005.

[4] Ram Zamir et al., "The rate loss in the Wyner-Ziv problem," IEEE Trans. Inf. Theory, 1996.

[5] Abbas El Gamal et al., "Achievable rates for multiple descriptions," IEEE Trans. Inf. Theory, 1982.

[6] Chao Tian et al., "Multiple description quantization via Gram–Schmidt orthogonalization," IEEE Trans. Inf. Theory, 2005.

[7] John C. Kieffer et al., "A survey of the theory of source coding with a fidelity criterion," IEEE Trans. Inf. Theory, 1993.

[8] Michelle Effros et al., "Improved bounds for the rate loss of multiresolution source codes," IEEE Trans. Inf. Theory, 2003.

[9] Claude E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., 1948.

[10] Vittorio Castelli et al., "Near tightness of the El Gamal and Cover region for two descriptions," Data Compression Conference, 2005.

[11] Aaron D. Wyner et al., "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, 1976.

[12] Michelle Effros et al., "On the achievable region for multiple description source codes on Gaussian sources," Proc. IEEE Int. Symp. Information Theory, 2003.

[13] Abel M. Rodrigues, Matrix Algebra Useful for Statistics, 2007.

[14] Toby Berger et al., "New results in binary multiple descriptions," IEEE Trans. Inf. Theory, 1987.

[15] Meir Feder et al., "Information rates of pre/post-filtered dithered quantizers," IEEE Trans. Inf. Theory, 1993.

[16] Jacob Ziv et al., "On universal quantization," IEEE Trans. Inf. Theory, 1985.

[17] Hanying Feng, Rate loss of network source codes, 2002.

[18] L. Ozarow, "On a source-coding problem with two channels and three receivers," Bell Syst. Tech. J., 1980.

[19] Ram Zamir, "Gaussian codes and Shannon bounds for multiple descriptions," IEEE Trans. Inf. Theory, 1999.

[20] Toby Berger et al., "All sources are nearly successively refinable," IEEE Trans. Inf. Theory, 2001.

[21] Rudolf Ahlswede et al., "The rate-distortion region for multiple descriptions without excess rate," IEEE Trans. Inf. Theory, 1985.

[22] Ram Zamir et al., "Dithered lattice-based quantizers for multiple descriptions," IEEE Trans. Inf. Theory, 2002.

[23] Claude E. Shannon, "Coding theorems for a discrete source with a fidelity criterion," IRE International Convention Record, vol. 7, 1959; reprinted in Claude E. Shannon: Collected Papers (N. J. A. Sloane and A. D. Wyner, eds.), 1993.