Source Coding With a Side Information “Vending Machine”

We study source coding in the presence of side information, when the system can take actions that affect the availability, quality, or nature of the side information. We begin by extending the Wyner-Ziv problem of source coding with decoder side information to the case where the decoder is allowed to choose actions affecting the side information. We then consider the setting where actions are taken by the encoder, based on its observation of the source. Actions may have costs that are commensurate with the quality of the side information they yield, and an overall per-symbol cost constraint may be imposed. We characterize the achievable tradeoffs between rate, distortion, and cost in some of these problem settings. Among our findings is the fact that even in the absence of a cost constraint, greedily choosing the action associated with the “best” side information is, in general, suboptimal. A few examples are worked out.
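For context, the baseline that this work generalizes is the classical Wyner-Ziv rate-distortion function for source coding with side information available only at the decoder. A standard statement (symbols here follow the usual convention, not notation from this abstract):

```latex
% Wyner-Ziv rate-distortion function: source X, decoder side information Y.
% U is an auxiliary random variable satisfying the Markov chain U -- X -- Y,
% and the decoder reconstructs via \hat{X} = g(U, Y).
R_{\mathrm{WZ}}(D) \;=\; \min_{\substack{p(u|x),\; g(\cdot,\cdot):\\ \mathbb{E}\left[d\left(X,\, g(U,Y)\right)\right] \,\le\, D}} \Big( I(X;U) \;-\; I(Y;U) \Big)
```

In the "vending machine" setting of this paper, an action A additionally shapes the side information channel p(y|x, a) and may incur a cost, so the paper's characterizations augment a conditional version of this expression with the rate needed to convey the action and a per-symbol cost constraint; the exact forms are given in the paper and are not reproduced here.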
