Discrete Lossy Gray–Wyner Revisited: Second-Order Asymptotics, Large and Moderate Deviations

In this paper, we revisit the discrete lossy Gray–Wyner problem. In particular, we derive its optimal second-order coding rate region, its error exponent (reliability function), and its moderate deviations constant under mild conditions on the source. To obtain the second-order asymptotics, we extend ideas from Watanabe's second-order analysis of the Gray–Wyner network. In particular, we leverage the properties of an appropriate generalization of the conditional distortion-tilted information density, which was first introduced by Kostina and Verdú. The converse part uses a perturbation argument due to Gu and Effros, originally employed in their strong converse proof for the discrete Gray–Wyner problem. The achievability part uses two novel elements: 1) a generalization of various type covering lemmas, and 2) the uniform continuity of the conditional rate-distortion function in both the source (joint) distribution and the distortion level. To obtain the error exponent, we use the same generalized type covering lemma for the achievability part and, for the converse part, the strong converse together with a change-of-measure technique. Finally, to obtain the moderate deviations constant, we apply the moderate deviations theorem to probabilities defined in terms of information spectrum quantities.
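As background for the second-order results, here is a minimal sketch (in our own notation, which need not match the paper's) of the d-tilted information of Kostina and Verdú and its role in the Gaussian approximation. For a source X with rate-distortion function R_X(d) achieved by an output distribution P_{Y^*}, one may write

  \jmath_X(x, d) = -\log \mathbb{E}\bigl[\exp\bigl(\lambda^* d - \lambda^* \mathsf{d}(x, Y^*)\bigr)\bigr], \qquad \lambda^* = -R_X'(d),

where the expectation is over Y^* \sim P_{Y^*}, unconditioned on x. It satisfies \mathbb{E}[\jmath_X(X, d)] = R_X(d), and its variance V_X(d) = \mathrm{Var}[\jmath_X(X, d)] plays the role of the rate-dispersion function: for a fixed excess-distortion probability \epsilon \in (0,1), the minimal achievable rate at blocklength n behaves as

  R(n, d, \epsilon) = R_X(d) + \sqrt{V_X(d)/n}\, Q^{-1}(\epsilon) + O\bigl(\tfrac{\log n}{n}\bigr),

with Q^{-1} the inverse of the Gaussian complementary CDF. A conditional analogue, relevant to the private links of the Gray–Wyner network, replaces R_X(d) by the conditional rate-distortion function R_{X|Y}(d) and conditions the expectation on the side information Y = y; the quantity studied in this paper is a generalization of this conditional distortion-tilted information density to the Gray–Wyner setting.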

[1] Abbas El Gamal and Young-Han Kim, Network Information Theory, Cambridge University Press, 2011.

[2] Shunsuke Ihara, Error Exponent for Coding of Memoryless Gaussian Sources with a Fidelity Criterion, 2000.

[3] Aaron D. Wyner et al., The rate-distortion function for source coding with side information at the decoder, IEEE Transactions on Information Theory, 1976.

[4] Vincent Yan Fu Tan et al., Nonasymptotic and Second-Order Achievability Bounds for Coding With Side-Information, IEEE Transactions on Information Theory, 2013.

[5] Shun Watanabe et al., Second-Order Region for Gray–Wyner Network, IEEE Transactions on Information Theory, 2015.

[6] E. Ordentlich et al., Inequalities for the L1 Deviation of the Empirical Distribution, 2003.

[7] Sergio Verdú et al., A new converse in rate-distortion theory, 46th Annual Conference on Information Sciences and Systems (CISS), 2012.

[8] Jun Chen et al., On the Redundancy of Slepian–Wolf Coding, IEEE Transactions on Information Theory, 2009.

[9] Kenneth Rose et al., Error exponents in scalable source coding, IEEE Transactions on Information Theory, 2003.

[10] Ioannis Kontoyiannis, Pointwise redundancy in lossy data compression and universal lossy data compression, IEEE Transactions on Information Theory, 2000.

[11] Kenneth Rose et al., The Lossy Common Information of Correlated Sources, IEEE Transactions on Information Theory, 2014.

[12] Tsachy Weissman et al., Strong Successive Refinability and Rate-Distortion-Complexity Tradeoff, IEEE Transactions on Information Theory, 2015.

[13] Amir Dembo et al., Large Deviations Techniques and Applications, 1998.

[14] Sergio Verdú et al., Fixed-Length Lossy Compression in the Finite Blocklength Regime, IEEE Transactions on Information Theory, 2011.

[15] Aaron B. Wagner et al., Moderate deviation analysis of channel coding: Discrete memoryless case, IEEE International Symposium on Information Theory, 2010.

[16] Imre Csiszár et al., Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.

[17] Aaron B. Wagner et al., Refinement of the Random Coding Bound, IEEE Transactions on Information Theory, 2012.

[18] Ioannis Kontoyiannis et al., Lossless compression with moderate error probability, IEEE International Symposium on Information Theory, 2013.

[19] Anant Sahai et al., On the uniform continuity of the rate-distortion function, IEEE International Symposium on Information Theory, 2008.

[20] Katalin Marton, Error exponent for source coding with a fidelity criterion, IEEE Transactions on Information Theory, 1974.

[21] T. Cover et al., Rate Distortion Theory, 2001.

[22] Vincent Y. F. Tan et al., Moderate-deviations of lossy source coding for discrete and Gaussian sources, IEEE International Symposium on Information Theory, 2012.

[23] Lizhong Zheng et al., Euclidean Information Theory, IEEE International Zurich Seminar on Communications, 2008.

[24] Victoria Kostina, Lossy data compression: Nonasymptotic fundamental limits, Ph.D. thesis, Princeton University, 2013.

[25] Yuval Kochman et al., The Dispersion of Lossy Source Coding, Data Compression Conference (DCC), 2011.

[26] Aaron B. Wagner et al., Refinement of the Sphere-Packing Bound: Asymmetric Channels, IEEE Transactions on Information Theory, 2012.

[28] Vincent Yan Fu Tan, Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities, Foundations and Trends in Communications and Information Theory, 2014.

[29] Vincent Yan Fu Tan et al., Second-Order Coding Rates for Channels With State, IEEE Transactions on Information Theory, 2014.

[30] Wei Liu et al., A Lossy Source Coding Interpretation of Wyner's Common Information, IEEE Transactions on Information Theory, 2016.

[31] S. Verdú et al., Channel dispersion and moderate deviations limits for memoryless channels, 48th Annual Allerton Conference on Communication, Control, and Computing, 2010.

[32] Jun Chen et al., On the Redundancy-Error Tradeoff in Slepian–Wolf Coding and Channel Coding, IEEE International Symposium on Information Theory, 2007.

[33] Prakash Narayan et al., Error exponents for successive refinement by partitioning, IEEE Transactions on Information Theory, 1996.

[34] Thomas M. Cover et al., Elements of Information Theory, 2005.

[35] Amin Gohari et al., A technique for deriving one-shot achievability results in network information theory, IEEE International Symposium on Information Theory, 2013.

[36] Wei-Hsin Gu and Michelle Effros, A strong converse for a collection of network source coding problems, IEEE International Symposium on Information Theory, 2009.

[37] Robert M. Gray and Aaron D. Wyner, Source coding for a simple network, Bell System Technical Journal, 1974.

[38] Aaron B. Wagner et al., Moderate Deviations in Channel Coding, IEEE Transactions on Information Theory, 2012.

[39] Mehul Motani et al., Second-order coding region for the discrete lossy Gray–Wyner source coding problem, IEEE International Symposium on Information Theory (ISIT), 2016.

[40] Aaron B. Wagner et al., Refinement of the sphere-packing bound, IEEE International Symposium on Information Theory, 2012.

[41] Thomas M. Cover et al., Network Information Theory, 2001.