Second-Order Coding Rates for Conditional Rate-Distortion

This paper characterizes the second-order coding rates for lossy source coding with side information available at both the encoder and the decoder. We first derive non-asymptotic bounds for this problem and then specialize them to three scenarios: discrete memoryless sources, Gaussian sources, and Markov sources, obtaining the second-order coding rates in each setting. Interestingly, the second-order coding rate for Gaussian source coding with Gaussian side information available at both the encoder and the decoder is the same as that for Gaussian source coding without side information: regardless of the variance of the side information, the dispersion is $1/2$ nats squared per source symbol.
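To make the claim concrete, a second-order (dispersion) expansion of the minimum coding rate typically takes the following standard form; the notation below is illustrative and the specific Gaussian expressions are the usual ones from conditional rate-distortion theory, not quoted from this paper.

```latex
% Minimum rate at blocklength n, excess-distortion probability \varepsilon,
% and distortion level d, with side information Y at encoder and decoder:
R(n, \varepsilon, d)
  = R_{X|Y}(d) + \sqrt{\frac{V}{n}}\, Q^{-1}(\varepsilon)
    + O\!\left(\frac{\log n}{n}\right),
% where Q^{-1} is the inverse of the Gaussian Q-function and V is the
% (conditional) rate-dispersion function.

% For jointly Gaussian (X, Y) with correlation \rho and conditional
% variance \sigma^2_{X|Y} = \sigma_X^2 (1 - \rho^2), for 0 < d < \sigma^2_{X|Y}:
R_{X|Y}(d) = \frac{1}{2}\ln\frac{\sigma^2_{X|Y}}{d},
\qquad
V = \frac{1}{2}\ \text{nats}^2,
% so the dispersion V is independent of the side-information variance,
% matching the observation in the abstract.
```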
