Source Coding When the Side Information May Be Delayed

For memoryless sources, delayed side information at the decoder does not improve the rate-distortion function. This is not the case for sources with memory, however, as demonstrated by a number of works focusing on the special case of (delayed) feedforward. In this paper, a setting is studied in which the encoder is potentially uncertain about the delay with which measurements of the side information, which is available at the encoder, are acquired at the decoder. Assuming a hidden Markov model for the source sequences, a single-letter characterization is first given for the setup where the side information delay is arbitrary and known at the encoder, and the reconstruction at the destination is required to be asymptotically lossless. Then, for a delay of zero or one source symbol, a single-letter characterization of the rate-distortion region is given for the case where, unbeknownst to the encoder, the side information may or may not be delayed. Finally, examples for binary and Gaussian sources are provided.
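The memoryless claim can be illustrated numerically for the standard doubly symmetric binary source (a textbook example; the specific parameters below are illustrative assumptions, not taken from the paper). For i.i.d. pairs (X_i, Y_i) with X uniform and Y obtained from X through a binary symmetric channel with crossover probability p, lossless coding needs H(X) = 1 bit without side information, H(X|Y) = h(p) bits with undelayed side information at the decoder, and, since Y_{i-d} is independent of X_i for any delay d ≥ 1, delayed side information brings the rate back to 1 bit:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Doubly symmetric binary source: X ~ Bernoulli(1/2), Y = X xor N, N ~ Bernoulli(p).
p = 0.1
rate_no_si = 1.0            # H(X): lossless rate with no side information
rate_si = h2(p)             # H(X|Y) = h2(p): undelayed side information helps
rate_delayed_si = 1.0       # H(X_i | Y_{i-d}) = H(X_i) for i.i.d. pairs, d >= 1

print(f"no SI: {rate_no_si:.3f}  SI: {rate_si:.3f}  delayed SI: {rate_delayed_si:.3f}")
```

For a source with memory, by contrast, Y_{i-d} remains correlated with X_i, which is why delay becomes a meaningful design parameter in the setting studied here.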
