On causal source codes with side information

We study the effect of introducing side information into the causal source coding setting of Neuhoff and Gilbert. We find that the spirit of their result, namely, the sufficiency of time-sharing scalar quantizers (followed by appropriate lossless coding) for attaining optimum performance within the family of causal source codes, extends to many scenarios in which side information is available (at both encoder and decoder, or at only one of them). For example, when side information is available at both encoder and decoder, time-sharing side-information-dependent scalar quantizers (at most two for each side-information symbol) attains optimum performance. This remains true even when the reproduction sequence is allowed noncausal dependence on the side information, and even when the source and the side information, rather than consisting of independent and identically distributed (i.i.d.) pairs, form, respectively, the output of a memoryless channel and its stationary ergodic input.
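As a toy illustration of the scheme described above, the sketch below simulates side-information-dependent scalar quantization with time-sharing between two codebooks per side-information symbol. The setup (binary side information Y, a Gaussian source whose variance depends on Y, the hand-picked codebooks, and the time-sharing fraction `lam`) is entirely assumed for illustration and does not come from the paper; the codebooks are not optimized.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: i.i.d. (X, Y) pairs, with the side information Y
# available at both encoder and decoder.
n = 10_000
y = rng.integers(0, 2, size=n)         # binary side-information symbols
x = rng.normal(scale=1.0 + y, size=n)  # source statistics depend on Y

# For each side-information symbol, at most two scalar quantizer codebooks
# are time-shared: a fraction `lam` of the symbols use the finer codebook.
codebooks = {
    0: (np.array([-1.5, -0.5, 0.5, 1.5]), np.array([-1.0, 1.0])),
    1: (np.array([-3.0, -1.0, 1.0, 3.0]), np.array([-2.0, 2.0])),
}
lam = 0.7  # time-sharing fraction for the finer codebook

def quantize(x_sym, cb):
    """Nearest-neighbor scalar quantization against codebook cb."""
    return cb[np.argmin(np.abs(cb - x_sym))]

# Both sides observe Y (and the shared time-sharing coin), so the decoder
# knows which codebook each quantizer index refers to.
use_fine = rng.random(n) < lam
x_hat = np.array([
    quantize(xi, codebooks[yi][0] if fine else codebooks[yi][1])
    for xi, yi, fine in zip(x, y, use_fine)
])
mse = np.mean((x - x_hat) ** 2)
print(f"time-sharing fraction {lam}, mean-squared distortion: {mse:.3f}")
```

Varying `lam` between 0 and 1 trades rate for distortion along the line segment between the operating points of the two codebooks, which is the role time-sharing plays in the optimality result.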

[1]  Mung Chiang,et al.  Duality between channel capacity and rate distortion with two-sided state information , 2002, IEEE Trans. Inf. Theory.

[2]  Shlomo Shamai,et al.  On the capacity of some channels with channel state information , 1999, IEEE Trans. Inf. Theory.

[3]  Thomas M. Cover,et al.  A Proof of the Data Compression Theorem of Slepian and Wolf for Ergodic Sources , 1971 .

[4]  Tamás Linder,et al.  A zero-delay sequential scheme for lossy coding of individual sequences , 2001, IEEE Trans. Inf. Theory.

[5]  Demosthenis Teneketzis,  Optimal Real-Time Encoding-Decoding of Markov Sources in Noisy Environments , 2004 .

[6]  Tsachy Weissman,et al.  On limited-delay lossy coding and filtering of individual sequences , 2002, IEEE Trans. Inf. Theory.

[7]  Bhaskar D. Rao,et al.  Multiple antenna channels with partial channel state information at the transmitter , 2004, IEEE Transactions on Wireless Communications.

[8]  Tamás Linder,et al.  Optimal entropy-constrained scalar quantization of a uniform source , 2000, IEEE Trans. Inf. Theory.

[9]  Ram Zamir,et al.  Causal source coding of stationary sources with high resolution , 2001, Proceedings. 2001 IEEE International Symposium on Information Theory (IEEE Cat. No.01CH37252).

[10]  Rudolf Ahlswede,et al.  Source coding with side information and a converse for degraded broadcast channels , 1975, IEEE Trans. Inf. Theory.

[11]  Sergio Verdú,et al.  Fading Channels: Information-Theoretic and Communications Aspects , 2000 .

[12]  Max H. M. Costa,et al.  Writing on dirty paper , 1983, IEEE Trans. Inf. Theory.

[14]  R. K. Gilbert,et al.  Bounds to the performance of causal codes for Markov sources , 1979 .

[15]  Abbas El Gamal,et al.  On the capacity of computer memory with defects , 1983, IEEE Trans. Inf. Theory.

[16]  Tamás Linder,et al.  Efficient adaptive algorithms and minimax bounds for zero-delay lossy source coding , 2004, IEEE Transactions on Signal Processing.

[17]  N. THOMAS GAARDER,et al.  On optimal finite-state digital transmission systems , 1982, IEEE Trans. Inf. Theory.

[18]  Young-Han Kim,et al.  Multiple user writing on dirty paper , 2004, International Symposium onInformation Theory, 2004. ISIT 2004. Proceedings..

[19]  Neri Merhav,et al.  Source coding exponents for zero-delay coding with finite memory , 2003, IEEE Trans. Inf. Theory.

[20]  Tamás Linder,et al.  A "follow the perturbed leader"-type algorithm for zero-delay quantization of individual sequences , 2004, Data Compression Conference, 2004. Proceedings. DCC 2004.

[21]  Tadao Kasami,et al.  An error correcting scheme for defective memory , 1978, IEEE Trans. Inf. Theory.

[22]  Tamás Linder,et al.  On the structure of optimal entropy-constrained scalar quantizers , 2002, IEEE Trans. Inf. Theory.

[23]  Jean C. Walrand,et al.  Optimal causal coding - decoding problems , 1983, IEEE Trans. Inf. Theory.

[24]  Prakash Narayan,et al.  Capacities of time-varying multiple-access channels with side information , 2001, Proceedings 2001 IEEE Information Theory Workshop (Cat. No.01EX494).

[25]  H. Witsenhausen,  On the structure of real-time source coders , 1979, The Bell System Technical Journal.

[26]  William Equitz,et al.  Successive refinement of information , 1991, IEEE Trans. Inf. Theory.

[27]  David L. Neuhoff,et al.  Causal source codes , 1982, IEEE Trans. Inf. Theory.

[28]  Yossef Steinberg,et al.  On Coding with Rate-Limited Side Information , 2003 .

[29]  R. A. McDonald,et al.  Noiseless Coding of Correlated Information Sources , 1973 .

[30]  Srikrishna Bhashyam,et al.  Feedback gain in multiple antenna systems , 2002, IEEE Trans. Commun..

[31]  David L. Neuhoff,et al.  Quantization , 1998, IEEE Trans. Inf. Theory.

[32]  Tsachy Weissman,et al.  On competitive zero-delay joint source-channel coding , 2004 .

[33]  Abbas El Gamal,et al.  Achievable rates for multiple descriptions , 1982, IEEE Trans. Inf. Theory.

[34]  M. Salehi,  Capacity and coding for memories with real-time noisy defect information at encoder and decoder , 1992 .