Non-Asymptotic Converse Bounds and Refined Asymptotics for Two Lossy Source Coding Problems

In this paper, we revisit two multi-terminal lossy source coding problems: the lossy source coding problem with side information available at the encoder and one of the two decoders, which we term the Kaspi problem (Kaspi, 1994), and the multiple description coding problem with one semi-deterministic distortion measure, which we refer to as the Fu-Yeung problem (Fu and Yeung, 2002). For the Kaspi problem, we first present properties of the optimal test channels. We then generalize the notion of the distortion-tilted information density from the point-to-point lossy source coding problem to the Kaspi problem and, using the properties of the optimal test channels together with this well-defined distortion-tilted information density, prove a non-asymptotic converse bound. Finally, for discrete memoryless sources, we derive refined asymptotics, which include the second-order, large deviations and moderate deviations asymptotics. In the converse proof of the second-order asymptotics, we apply the Berry-Esseen theorem to the derived non-asymptotic converse bound. The achievability proof proceeds by first establishing a type-covering lemma tailored to the Kaspi problem, then Taylor expanding the distortion-tilted information densities, and finally applying the Berry-Esseen theorem. We then extend these methods to the Fu-Yeung problem and obtain the properties of the optimal test channels for the minimum sum-rate function, a non-asymptotic converse bound, and refined asymptotics for discrete memoryless sources. Since the successive refinement problem is a special case of the Fu-Yeung problem, as a by-product we obtain a non-asymptotic converse bound for the successive refinement problem that strictly generalizes the non-asymptotic converse bound for successively refinable sources (Zhou, Tan and Motani, 2017).
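As background for the second-order results summarized above, recall the classical point-to-point Gaussian approximation (in the style of Kostina and Verdú): the minimal rate at blocklength $n$ and excess-distortion probability $\varepsilon$ behaves as $R(n, D, \varepsilon) \approx R(D) + \sqrt{V(D)/n}\, Q^{-1}(\varepsilon)$, where $V(D)$ is the variance of the distortion-tilted information density. The sketch below evaluates this approximation for a Bernoulli($p$) source under Hamming distortion, using the standard closed forms $R(D) = h_b(p) - h_b(D)$ and $V(D) = p(1-p)\log_2^2((1-p)/p)$; note this is the single-terminal baseline only, not the Kaspi or Fu-Yeung expansions derived in the paper, and the specific values of $p$, $D$, $n$, $\varepsilon$ are illustrative.

```python
import math
from statistics import NormalDist

def binary_entropy(p: float) -> float:
    """Binary entropy h_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion(p: float, d: float) -> float:
    """R(D) = h_b(p) - h_b(D) for a Bernoulli(p) source with Hamming
    distortion, valid for D <= min(p, 1 - p)."""
    return binary_entropy(p) - binary_entropy(d)

def dispersion(p: float) -> float:
    """Source dispersion V = Var of the D-tilted information density:
    p(1-p) * log2((1-p)/p)^2 bits^2 (constant in D in this regime)."""
    return p * (1 - p) * math.log2((1 - p) / p) ** 2

def second_order_rate(p: float, d: float, n: int, eps: float) -> float:
    """Gaussian (second-order) approximation to the minimal rate at
    blocklength n and excess-distortion probability eps."""
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return rate_distortion(p, d) + math.sqrt(dispersion(p) / n) * q_inv

# Illustrative numbers: Bernoulli(0.11) source, D = 0.05, n = 1000, eps = 0.01.
r = rate_distortion(0.11, 0.05)
r_n = second_order_rate(0.11, 0.05, 1000, 0.01)
print(f"R(D) = {r:.4f} bits; second-order rate at n = 1000: {r_n:.4f} bits")
```

The gap between `r_n` and `r` is the finite-blocklength rate penalty, which vanishes as $O(1/\sqrt{n})$; for a fair coin ($p = 0.5$) the dispersion is zero and the penalty disappears at this order.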

[1] Mehul Motani et al., Kaspi Problem Revisited: Non-Asymptotic Converse Bound and Second-Order Asymptotics, 2017, GLOBECOM 2017 - 2017 IEEE Global Communications Conference.

[2] Toby Berger et al., Failure of successive refinement for symmetric Gaussian mixtures, 1997, IEEE Trans. Inf. Theory.

[3] T. Berger, Rate-Distortion Theory, 2003.

[4] Mehul Motani et al., On the Multiple Description Coding Problem with One Semi-Deterministic Distortion Measure, 2017, GLOBECOM 2017 - 2017 IEEE Global Communications Conference.

[5] Mehul Motani et al., Second-Order and Moderate Deviations Asymptotics for Successive Refinement, 2016, IEEE Transactions on Information Theory.

[6] Aaron B. Wagner et al., Moderate Deviations in Channel Coding, 2012, IEEE Transactions on Information Theory.

[7] Katalin Marton et al., Error exponent for source coding with a fidelity criterion, 1974, IEEE Trans. Inf. Theory.

[8] E. Telatar et al., The Kaspi Rate-Distortion Problem with Encoder Side-Information: Binary Erasure Case, 2006.

[9] L. Ozarow et al., On a source-coding problem with two channels and three receivers, 1980, The Bell System Technical Journal.

[10] Ram Zamir, Gaussian Codes and Shannon Bounds for Multiple Descriptions, 1998.

[11] S. Sandeep Pradhan et al., On the Role of Feedforward in Gaussian Sources: Point-to-Point Source Coding and Multiple Description Source Coding, 2007, IEEE Transactions on Information Theory.

[12] Vincent Yan Fu Tan et al., Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities, 2014, Found. Trends Commun. Inf. Theory.

[13] S. Varadhan et al., Asymptotic evaluation of certain Markov process expectations for large time, 1975.

[14] Amin Gohari et al., A technique for deriving one-shot achievability results in network information theory, 2013, 2013 IEEE International Symposium on Information Theory.

[15] Amiram H. Kaspi et al., Rate-distortion function when side-information may be present at the decoder, 1994, IEEE Trans. Inf. Theory.

[16] A. Kanlis et al., Error exponents for successive refinement by partitioning, 1995, Proceedings of 1995 IEEE International Symposium on Information Theory.

[17] Tsachy Weissman et al., Strong Successive Refinability and Rate-Distortion-Complexity Tradeoff, 2015, IEEE Transactions on Information Theory.

[18] Mehul Motani et al., Second-Order Coding Rates for Conditional Rate-Distortion, 2014, arXiv.

[19] Rudolf Ahlswede et al., Source coding with side information and a converse for degraded broadcast channels, 1975, IEEE Trans. Inf. Theory.

[20] R. Yeung et al., On the rate-distortion region for multiple descriptions, 2000, 2000 IEEE International Symposium on Information Theory.

[21] Kenneth Rose et al., Computation and analysis of the N-layer scalable rate-distortion function, 2003, IEEE Trans. Inf. Theory.

[22] Ertem Tuncel et al., The rate-distortion function for successive refinement of abstract sources, 2017, 2017 IEEE International Symposium on Information Theory (ISIT).

[23] Michelle Effros et al., A strong converse for a collection of network source coding problems, 2009, 2009 IEEE International Symposium on Information Theory.

[24] H. Vincent Poor et al., Channel coding: non-asymptotic fundamental limits, 2010.

[25] Vivek K. Goyal et al., Multiple description coding with many channels, 2003, IEEE Trans. Inf. Theory.

[26] Abbas El Gamal et al., Achievable rates for multiple descriptions, 1982, IEEE Trans. Inf. Theory.

[27] Ramji Venkataramanan et al., Multiple descriptions with feed-forward: A single-letter achievable rate region, 2008, 2008 IEEE International Symposium on Information Theory.

[28] Abbas El Gamal et al., Network Information Theory, 2011, Cambridge University Press.

[29] Shunsuke Ihara, Error Exponent for Coding of Memoryless Gaussian Sources with a Fidelity Criterion, 2000.

[30] S. Verdú et al., Channel dispersion and moderate deviations limits for memoryless channels, 2010, 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[31] Masahito Hayashi et al., Information Spectrum Approach to Second-Order Coding Rate in Channel Coding, 2008, IEEE Transactions on Information Theory.

[32] Sergio Verdú et al., A new converse in rate-distortion theory, 2012, 2012 46th Annual Conference on Information Sciences and Systems (CISS).

[33] Toby Berger et al., Rate distortion when side information may be absent, 1985, IEEE Trans. Inf. Theory.

[34] Robert M. Gray et al., Source coding for a simple network, 1974.

[35] Toby Berger et al., New results in binary multiple descriptions, 1987, IEEE Trans. Inf. Theory.

[36] S. Diggavi et al., A Calculation of the Heegard-Berger Rate-distortion Function for a Binary Source, 2006, 2006 IEEE Information Theory Workshop - ITW '06 Chengdu.

[37] C. E. Shannon, Coding Theorems for a Discrete Source With a Fidelity Criterion, Institute of Radio Engineers, International Convention Record, vol. 7, 1959 (reprinted 1993).

[38] Suhas N. Diggavi et al., Lossy source coding with Gaussian or erased side-information, 2009, 2009 IEEE International Symposium on Information Theory.

[39] Aaron D. Wyner et al., The rate-distortion function for source coding with side information at the decoder, 1976, IEEE Trans. Inf. Theory.

[40] Shun Watanabe et al., Second-Order Region for Gray–Wyner Network, 2015, IEEE Transactions on Information Theory.

[41] E. Telatar et al., The Kaspi Rate-Distortion Problem with Encoder Side-Information: Gaussian Case, 2005.

[42] Masahito Hayashi et al., Second-Order Asymptotics in Fixed-Length Source Coding and Intrinsic Randomness, 2005, IEEE Transactions on Information Theory.

[43] H. S. Witsenhausen et al., B.S.T.J. brief: On source networks with minimal breakdown degradation, 1980, The Bell System Technical Journal.

[44] Vincent Yan Fu Tan et al., Second-Order Coding Rates for Channels With State, 2014, IEEE Transactions on Information Theory.

[45] Vincent Y. F. Tan et al., Moderate-deviations of lossy source coding for discrete and Gaussian sources, 2011, 2012 IEEE International Symposium on Information Theory Proceedings.

[46] Mehul Motani et al., Discrete Lossy Gray–Wyner Revisited: Second-Order Asymptotics, Large and Moderate Deviations, 2015, IEEE Transactions on Information Theory.

[47] Lizhong Zheng et al., Euclidean Information Theory, 2008, 2008 IEEE International Zurich Seminar on Communications.

[48] Victoria Kostina et al., Lossy data compression: Nonasymptotic fundamental limits, 2013.

[49] Yuval Kochman et al., The Dispersion of Lossy Source Coding, 2011, 2011 Data Compression Conference.

[50] Bixio Rimoldi et al., Successive refinement of information: characterization of the achievable rates, 1994, IEEE Trans. Inf. Theory.

[51] Thomas M. Cover et al., Elements of Information Theory, 2005.

[52] Aaron D. Wyner et al., On source coding with side information at the decoder, 1975, IEEE Trans. Inf. Theory.

[53] Vincent Yan Fu Tan et al., Nonasymptotic and Second-Order Achievability Bounds for Coding With Side-Information, 2013, IEEE Transactions on Information Theory.

[54] Rudolf Ahlswede et al., On multiple descriptions and team guessing, 1986, IEEE Trans. Inf. Theory.

[55] Amir Dembo et al., Large Deviations Techniques and Applications, 1998.

[56] Sergio Verdú et al., Fixed-Length Lossy Compression in the Finite Blocklength Regime, 2011, IEEE Transactions on Information Theory.

[57] A. Wyner et al., Source coding for multiple descriptions, 1980, The Bell System Technical Journal.

[58] Mehul Motani et al., Second-order coding region for the discrete successive refinement source coding problem, 2016, 2016 IEEE International Symposium on Information Theory (ISIT).