Reliability in Source Coding With Side Information

We study error exponents for source coding with side information. Achievable exponents and converse bounds are obtained for two cases: lossless source coding with coded side information and lossy source coding with full side information at the decoder (Wyner-Ziv). These bounds recover and extend several existing results on source-coding error exponents and are tight in some circumstances. They admit a natural interpretation as a two-player game between nature, which seeks to minimize the exponent, and the code designer, who seeks to maximize it. In the Wyner-Ziv problem, the analysis exposes a tension in the choice of test channel: the optimal test channel balances two competing error events. The Gaussian and binary-erasure cases are examined in detail.
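For readers new to the terminology, here is a brief sketch of the standard definitions behind these results; the symbols $E(R)$, $P_e^{(n)}$, $U$, and $g$ are generic notation, not taken from the paper. The error exponent at rate $R$ is the asymptotic exponential decay rate of the error probability of the best length-$n$ code,
\[
E(R) = \lim_{n \to \infty} -\frac{1}{n} \log P_e^{(n)}(R)
\]
(when the limit exists), and the test channel in the Wyner-Ziv setting is the auxiliary channel $P_{U|X}$ appearing in Wyner and Ziv's rate-distortion function,
\[
R_{\mathrm{WZ}}(D) = \min_{P_{U|X},\, g}\; I(X; U \mid Y),
\]
where the minimum is over auxiliary variables $U$ satisfying the Markov chain $U - X - Y$ and decoders $g$ with $\mathbb{E}\,[d(X, g(U, Y))] \le D$. The tension mentioned above arises because the test channel that is best for one error event need not be best for the other.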
