On Optimum Conventional Quantization for Source Coding with Side Information at the Decoder

Let $X$ and $Y$ denote two jointly memoryless sources with finite alphabets. Suppose that $X$ is to be encoded in a lossy manner with $Y$ as side information available only at the decoder. A common approach to this lossy source coding problem is to apply conventional vector quantization followed by Slepian-Wolf coding. In this paper we are interested in the rate-distortion performance achievable asymptotically by this approach. Given an arbitrary single-letter distortion measure $d$, it is shown that the best rate achievable asymptotically under the constraint that $X$ is recovered with distortion no greater than $D \ge 0$ is $R_{WZ}(D) = \min_{\hat X} \,[\, I(X;\hat X) - I(Y;\hat X) \,]$, where the minimum is taken over all auxiliary random variables $\hat X$ such that $\mathbb{E}\, d(X,\hat X) \le D$ and $\hat X \to X \to Y$ forms a Markov chain. An extended Blahut-Arimoto algorithm is then proposed to calculate $R_{WZ}(D)$ for any $(X,Y)$ and any distortion measure, and its convergence is proved. Interestingly, the random variable $\hat X$ achieving $R_{WZ}(D)$ is, in general, different from the random variable $\hat X'$ achieving the classical rate-distortion function $R(D)$ of $X$ at distortion $D$. In particular, it is shown that for binary sources under the Hamming distortion measure, the random variable $\hat X$ achieving $R_{WZ}(D)$ coincides with the random variable $\hat X'$ achieving $R(D)$ if and only if the channel $p_{Y|X}$ from $X$ to $Y$ is symmetric. Thus, the design of conventional quantization with side information at the decoder should differ from the design without side information.
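The abstract does not spell out the extended Blahut-Arimoto iteration. As a sketch, assuming the extension follows the usual alternating-minimization template (as in Blahut's original algorithm and Willems' computation of the Wyner-Ziv rate-distortion function), one can rewrite the Lagrangian of the minimization above as a double minimization; the auxiliary conditional $s(\hat x \mid y)$ below is introduced for this illustration and is not notation from the paper. For fixed $\lambda \ge 0$ and test channel $q(\hat x \mid x)$, using $I(X;\hat X) - I(Y;\hat X) = H(\hat X \mid Y) - H(\hat X \mid X)$ (which follows from the Markov chain $\hat X \to X \to Y$),

\[
F_\lambda(q) \;=\; I(X;\hat X) - I(Y;\hat X) + \lambda\,\mathbb{E}\,d(X,\hat X)
\;=\; \min_{s}\ \sum_{x,\hat x} p(x)\,q(\hat x\mid x)
\Big[\log q(\hat x\mid x) \;-\; \sum_{y} p(y\mid x)\log s(\hat x\mid y) \;+\; \lambda\,d(x,\hat x)\Big],
\]

with the inner minimum attained at $s(\hat x\mid y)=\sum_x p(x\mid y)\,q(\hat x\mid x)$. Alternating the two coordinate minimizations gives the updates

\[
s^{(t)}(\hat x\mid y) = \sum_x p(x\mid y)\,q^{(t)}(\hat x\mid x),
\qquad
q^{(t+1)}(\hat x\mid x) \;\propto\; \exp\!\Big(\sum_y p(y\mid x)\log s^{(t)}(\hat x\mid y) \;-\; \lambda\,d(x,\hat x)\Big),
\]

each of which can only decrease $F_\lambda$, so the sequence $F_\lambda(q^{(t)})$ is monotone and converges; sweeping $\lambda$ then traces out points on $R_{WZ}(D)$.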

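For concreteness, here is a minimal numerical sketch of the alternating minimization above, in Python with NumPy. The function name conventional_wz_rate, its arguments, and the random initialization are illustrative assumptions, not the paper's published algorithm; the two update steps implement the equations displayed above. Sweeping the multiplier lam from 0 upward traces out (rate, distortion) points on the lower convex envelope of the $R_{WZ}(D)$ curve.

```python
import numpy as np

def conventional_wz_rate(p_xy, d, lam, n_iter=2000, tol=1e-12, seed=0):
    """Sketch: minimize I(X;Xhat) - I(Y;Xhat) + lam*E[d(X,Xhat)] over test
    channels q(xhat|x); the chain Xhat -> X -> Y holds by construction.

    p_xy : joint pmf of (X, Y), shape (|X|, |Y|)
    d    : distortion matrix, shape (|X|, |Xhat|)
    lam  : Lagrange multiplier trading rate against distortion
    """
    px = p_xy.sum(axis=1)                     # marginal p(x)
    py = p_xy.sum(axis=0)                     # marginal p(y)
    p_y_given_x = p_xy / px[:, None]          # channel p(y|x)
    rng = np.random.default_rng(seed)
    q = rng.random((len(px), d.shape[1]))     # random initial q(xhat|x)
    q /= q.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # s-step: s(xhat|y), the conditional of Xhat given Y induced by q
        p_yxh = p_xy.T @ q                    # p(y, xhat)
        s = p_yxh / p_yxh.sum(axis=1, keepdims=True)
        # q-step: q(xhat|x) proportional to
        #   exp( sum_y p(y|x) log s(xhat|y) - lam * d(x,xhat) )
        expo = p_y_given_x @ np.log(np.maximum(s, 1e-300)) - lam * d
        q_new = np.exp(expo - expo.max(axis=1, keepdims=True))
        q_new /= q_new.sum(axis=1, keepdims=True)
        done = np.max(np.abs(q_new - q)) < tol
        q = q_new
        if done:
            break

    # Report rate I(X;Xhat) - I(Y;Xhat) in bits, and expected distortion.
    r = px @ q                                # marginal p(xhat)
    joint_x = px[:, None] * q                 # p(x, xhat)
    p_yxh = p_xy.T @ q                        # p(y, xhat)

    def mi(joint, ma, mb):                    # mutual information in bits
        mask = joint > 0
        return np.sum(joint[mask] * np.log2(joint[mask] / np.outer(ma, mb)[mask]))

    rate = mi(joint_x, px, r) - mi(p_yxh, py, r)
    dist = np.sum(joint_x * d)
    return rate, dist

# Hypothetical check: doubly symmetric binary source, X ~ Bernoulli(1/2),
# Y = X XOR Bernoulli(0.1), Hamming distortion.
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])
hamming = 1.0 - np.eye(2)
print(conventional_wz_rate(p_xy, hamming, lam=2.0))
```

Because each coordinate step can only lower the Lagrangian, the iteration is monotone, mirroring the convergence argument for the classical Blahut-Arimoto algorithm.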