On hierarchical joint source-channel coding with degraded side information

We extend the setting of two-stage lossy source coding with successive refinement structures to a joint source-channel coding setting. In particular, we consider a problem where two descriptions of a memoryless source are to be transmitted across two independent memoryless channels, and where the output of the channel corresponding to the first (coarse) description is also available to the second-stage (refinement) decoder. Side information (SI), correlated with the source, may also be available to the decoders. In that case, we confine attention to degraded SI, in the sense that the source, the SI available at the refinement decoder, and the SI available at the coarse decoder form a Markov chain in this order. Our first result is a separation theorem asserting that in the limit of long blocks, no optimality is lost by first applying lossy successive-refinement source coding, regardless of the channels, and then applying good channel codes to each of the resulting bitstreams, regardless of the source and the SI. It is also shown that (even noiseless) feedback from the output of the first channel to the input of the second encoder cannot improve performance, but may sometimes significantly facilitate the implementation of optimum codes. We provide two examples where single-letter codes (of unit block length) achieve optimum performance, provided that feedback from the channel output of the first stage is available to the encoder of the refinement stage. In one of these examples, it is evident that without feedback, optimality cannot be achieved with a unit-length code. Motivated by these examples, we then investigate single-letter codes for this system. Necessary and sufficient conditions are furnished for the optimality of single-letter codes with and without feedback.
A corollary of these conditions is that for the quadratic distortion measure, feedback is necessary for the optimality of single-letter codes, regardless of the source distribution and the channel statistics.
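To make the notion of a single-letter (unit block length) code concrete, the following sketch illustrates the classical single-stage analogue of the schemes discussed above: uncoded transmission of a Gaussian source over an AWGN channel under quadratic distortion, where a simple scale-and-estimate strategy meets the distortion-rate bound at the channel capacity. This is a standard textbook example, not the paper's two-stage construction; all parameter names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma2, P, N0 = 1.0, 4.0, 1.0   # source variance, channel power budget, noise variance

# Single-letter encoder: scale each source sample to meet the power constraint.
x = rng.normal(0.0, np.sqrt(sigma2), n)
a = np.sqrt(P / sigma2)
y = a * x + rng.normal(0.0, np.sqrt(N0), n)   # one channel use per source sample

# Single-letter decoder: linear MMSE estimate of x from y.
xhat = (a * sigma2 / (a**2 * sigma2 + N0)) * y
d_emp = np.mean((x - xhat) ** 2)

# Benchmark: Gaussian distortion-rate function evaluated at the channel capacity.
C = 0.5 * np.log2(1.0 + P / N0)      # capacity in bits per channel use
d_opta = sigma2 * 2.0 ** (-2.0 * C)  # equals sigma2 * N0 / (P + N0)
```

Here the empirical distortion `d_emp` matches the optimum `d_opta = sigma2/(1 + P/N0)` up to simulation noise, showing that a code of unit block length can be optimal when source and channel are probabilistically matched.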
