Non-Asymptotic Converse Bounds and Refined Asymptotics for Two Source Coding Problems

In this paper, we revisit two multi-terminal lossy source coding problems: the lossy source coding problem with side information available at the encoder and one of the two decoders, which we term the Kaspi problem (Kaspi, 1994), and the multiple description coding problem with one semi-deterministic distortion measure, which we refer to as the Fu-Yeung problem (Fu and Yeung, 2002). For the Kaspi problem, we first present properties of the optimal test channels. We then generalize the notion of the distortion-tilted information density from the lossy source coding problem to the Kaspi problem and prove a non-asymptotic converse bound using the properties of the optimal test channels and the well-defined distortion-tilted information density. Finally, for discrete memoryless sources, we derive refined asymptotics, which include the second-order, large deviations, and moderate deviations asymptotics. In the converse proof of the second-order asymptotics, we apply the Berry-Esseen theorem to the derived non-asymptotic converse bound. The achievability proof proceeds by first establishing a type-covering lemma tailored to the Kaspi problem, then suitably Taylor-expanding the distortion-tilted information densities, and finally applying the Berry-Esseen theorem. We then generalize these methods to the Fu-Yeung problem. As a result, we obtain properties of the optimal test channels for the minimum sum-rate function, a non-asymptotic converse bound, and refined asymptotics for discrete memoryless sources. Since the successive refinement problem is a special case of the Fu-Yeung problem, as a by-product we obtain a non-asymptotic converse bound for the successive refinement problem, which strictly generalizes the non-asymptotic converse bound for successively refinable sources (Zhou, Tan, and Motani, 2017).
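To make the second-order asymptotics mentioned above concrete: for point-to-point lossy compression (Kostina and Verdú [50]), the minimum rate at blocklength n, distortion level d, and excess-distortion probability eps admits the Gaussian approximation R(n, d, eps) ≈ R(d) + sqrt(V(d)/n) Q^{-1}(eps), where V(d) is the rate-dispersion. The following is a minimal numerical sketch for the standard example of a Bernoulli(p) source under Hamming distortion (not the Kaspi or Fu-Yeung settings of this paper); the function name and parameter choices are illustrative.

```python
from math import log2, sqrt
from statistics import NormalDist  # Python 3.8+ standard library


def h2(p: float) -> float:
    """Binary entropy function in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p) if 0 < p < 1 else 0.0


def gaussian_approx_rate(n: int, p: float, d: float, eps: float) -> float:
    """Second-order (Gaussian) approximation
        R(n, d, eps) ~ R(d) + sqrt(V(d)/n) * Qinv(eps)
    for a Bernoulli(p) source under Hamming distortion d (0 < d < p <= 1/2)
    and excess-distortion probability eps."""
    rate_distortion = h2(p) - h2(d)                     # R(d) in bits
    dispersion = p * (1 - p) * log2((1 - p) / p) ** 2   # V(d) = Var[-log2 P(X)]
    q_inv = NormalDist().inv_cdf(1 - eps)               # inverse Gaussian tail
    return rate_distortion + sqrt(dispersion / n) * q_inv
```

For eps < 1/2 the approximation exceeds R(d) by a backoff of order 1/sqrt(n) that vanishes as the blocklength grows, which is the qualitative behavior the second-order results in this paper quantify for the Kaspi and Fu-Yeung problems.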

[1] Toby Berger et al., "Failure of successive refinement for symmetric Gaussian mixtures," IEEE Trans. Inf. Theory, 1997.

[2] Mehul Motani et al., "Second-Order and Moderate Deviations Asymptotics for Successive Refinement," IEEE Trans. Inf. Theory, 2016.

[3] Amin Gohari et al., "A technique for deriving one-shot achievability results in network information theory," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2013.

[4] Imre Csiszár et al., Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed., 2011.

[5] Sergio Verdú et al., "A new converse in rate-distortion theory," Proc. 46th Annu. Conf. Inf. Sci. Syst. (CISS), 2012.

[6] Amiram H. Kaspi, "Rate-distortion function when side-information may be present at the decoder," IEEE Trans. Inf. Theory, 1994.

[7] E. Telatar et al., "The Kaspi Rate-Distortion Problem with Encoder Side-Information: Binary Erasure Case," 2006.

[8] Toby Berger et al., "New results in binary multiple descriptions," IEEE Trans. Inf. Theory, 1987.

[9] Katalin Marton, "Error exponent for source coding with a fidelity criterion," IEEE Trans. Inf. Theory, 1974.

[10] L. Ozarow, "On a source-coding problem with two channels and three receivers," Bell Syst. Tech. J., 1980.

[11] S. Sandeep Pradhan et al., "On the Role of Feedforward in Gaussian Sources: Point-to-Point Source Coding and Multiple Description Source Coding," IEEE Trans. Inf. Theory, 2007.

[12] Vincent Y. F. Tan et al., "Moderate-deviations of lossy source coding for discrete and Gaussian sources," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2012.

[13] A. Kanlis et al., "Error exponents for successive refinement by partitioning," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 1995.

[14] Mehul Motani et al., "Second-order coding region for the discrete successive refinement source coding problem," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2016.

[15] Vivek K. Goyal et al., "Multiple description coding with many channels," IEEE Trans. Inf. Theory, 2003.

[16] Abbas El Gamal et al., "Achievable rates for multiple descriptions," IEEE Trans. Inf. Theory, 1982.

[17] Shunsuke Ihara, "Error Exponent for Coding of Memoryless Gaussian Sources with a Fidelity Criterion," 2000.

[18] Ram Zamir, "Gaussian Codes and Shannon Bounds for Multiple Descriptions," 1998.

[19] Mehul Motani et al., "Discrete Lossy Gray–Wyner Revisited: Second-Order Asymptotics, Large and Moderate Deviations," IEEE Trans. Inf. Theory, 2015.

[20] Tsachy Weissman et al., "Strong Successive Refinability and Rate-Distortion-Complexity Tradeoff," IEEE Trans. Inf. Theory, 2015.

[21] Yuval Kochman et al., "The dispersion of joint source-channel coding," Proc. 49th Annu. Allerton Conf. Commun., Control, Comput., 2011.

[22] Ertem Tuncel et al., "The rate-distortion function for successive refinement of abstract sources," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2017.

[23] Mehul Motani et al., "Second-Order Coding Rates for Conditional Rate-Distortion," arXiv preprint, 2014.

[24] Vincent Y. F. Tan et al., "Nonasymptotic and Second-Order Achievability Bounds for Coding With Side-Information," IEEE Trans. Inf. Theory, 2013.

[25] Aaron D. Wyner et al., "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, 1976.

[26] Rudolf Ahlswede, "On multiple descriptions and team guessing," IEEE Trans. Inf. Theory, 1986.

[27] Shun Watanabe et al., "Second-Order Region for Gray–Wyner Network," IEEE Trans. Inf. Theory, 2015.

[28] H. Vincent Poor et al., "Channel coding: non-asymptotic fundamental limits," 2010.

[29] Ramji Venkataramanan et al., "Multiple descriptions with feed-forward: A single-letter achievable rate region," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2008.

[30] Suhas N. Diggavi et al., "Lossy source coding with Gaussian or erased side-information," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2009.

[31] Lizhong Zheng et al., "Euclidean Information Theory," Proc. IEEE Int. Zurich Seminar Commun., 2008.

[32] Sergio Verdú et al., "Channel dispersion and moderate deviations limits for memoryless channels," Proc. 48th Annu. Allerton Conf. Commun., Control, Comput., 2010.

[33] Masahito Hayashi, "Information Spectrum Approach to Second-Order Coding Rate in Channel Coding," IEEE Trans. Inf. Theory, 2008.

[34] Victoria Kostina, "Lossy data compression: Nonasymptotic fundamental limits," 2013.

[35] Yuval Kochman et al., "The Dispersion of Lossy Source Coding," Proc. Data Compression Conf. (DCC), 2011.

[36] Bixio Rimoldi, "Successive refinement of information: characterization of the achievable rates," IEEE Trans. Inf. Theory, 1994.

[37] S. Diggavi et al., "A Calculation of the Heegard-Berger Rate-distortion Function for a Binary Source," Proc. IEEE Inf. Theory Workshop (ITW), 2006.

[38] Claude E. Shannon, "Coding theorems for a discrete source with a fidelity criterion," IRE International Convention Record, vol. 7, 1959 (reprinted 1993).

[39] Aaron B. Wagner et al., "Moderate Deviations in Channel Coding," IEEE Trans. Inf. Theory, 2012.

[40] Aaron D. Wyner, "On source coding with side information at the decoder," IEEE Trans. Inf. Theory, 1975.

[41] E. Telatar et al., "The Kaspi Rate-Distortion Problem with Encoder Side-Information: Gaussian Case," 2005.

[42] Mehul Motani et al., "Kaspi Problem Revisited: Non-Asymptotic Converse Bound and Second-Order Asymptotics," Proc. IEEE Global Commun. Conf. (GLOBECOM), 2017.

[43] H. S. Witsenhausen, "On source networks with minimal breakdown degradation," Bell Syst. Tech. J., 1980.

[44] Vincent Y. F. Tan et al., "Second-Order Coding Rates for Channels With State," IEEE Trans. Inf. Theory, 2014.

[45] Masahito Hayashi, "Second-Order Asymptotics in Fixed-Length Source Coding and Intrinsic Randomness," IEEE Trans. Inf. Theory, 2005.

[46] Mehul Motani et al., "On the Multiple Description Coding Problem with One Semi-Deterministic Distortion Measure," Proc. IEEE Global Commun. Conf. (GLOBECOM), 2017.

[47] Rudolf Ahlswede et al., "Source coding with side information and a converse for degraded broadcast channels," IEEE Trans. Inf. Theory, 1975.

[48] F.-W. Fu and R. W. Yeung, "On the rate-distortion region for multiple descriptions," Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2000.

[49] Kenneth Rose et al., "Computation and analysis of the N-Layer scalable rate-distortion function," IEEE Trans. Inf. Theory, 2003.

[50] Sergio Verdú et al., "Fixed-Length Lossy Compression in the Finite Blocklength Regime," IEEE Trans. Inf. Theory, 2011.

[51] A. Wyner et al., "Source coding for multiple descriptions," Bell Syst. Tech. J., 1980.

[52] S. Varadhan et al., "Asymptotic evaluation of certain Markov process expectations for large time," 1975.

[53] Vincent Y. F. Tan, "Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities," Found. Trends Commun. Inf. Theory, 2014.

[54] Toby Berger et al., "Rate distortion when side information may be absent," IEEE Trans. Inf. Theory, 1985.

[55] Robert M. Gray et al., "Source coding for a simple network," Bell Syst. Tech. J., 1974.