The Dispersion of Universal Joint Source-Channel Coding for Arbitrary Sources and Additive Channels

We consider a universal joint source-channel coding (JSCC) scheme for transmitting an arbitrary memoryless source over an arbitrary additive channel. We adopt an architecture that uses Gaussian codebooks for both the source reproduction sequences and the channel codewords. The natural minimum-Euclidean-distance encoder and decoder, however, must be judiciously modified to ensure universality and to attain the best (highest) possible communication rates. In particular, we consider the analogue of an unequal error (or message) protection scheme in which all sources are partitioned into disjoint power type classes. We also regularize the nearest-neighbor decoder so that an appropriate measure of the size of each power type class is taken into account in the decoding strategy. For this architecture, we derive ensemble-tight second-order and moderate deviations results. Our first-order result generalizes seminal results of Lapidoth (1996, 1997). The dispersion of our JSCC scheme is a linear combination of the mismatched dispersions for the channel coding saddle-point problem of Scarlett, Tan, and Durisi (2017) and the rate-distortion saddle-point problem of the present authors, thus also generalizing those results.
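As an informal illustration of the decoding rule described above, the following sketch implements a nearest-neighbor (minimum Euclidean distance) decoder over a Gaussian codebook, with an additive per-codeword regularization term standing in for the log-size of each power type class. This is a toy numerical example under our own assumptions (codebook size, noise law, and the form of the regularizer are all hypothetical choices for illustration), not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

n, M = 100, 16   # blocklength and number of codewords (toy sizes)
P = 1.0          # power parameter of the Gaussian codebook

# Gaussian codebook: each codeword drawn i.i.d. N(0, P).
codebook = rng.normal(0.0, np.sqrt(P), size=(M, n))

# Hypothetical regularizers playing the role of the (normalized) log-size
# of each power type class; the paper's actual weights differ.
reg = rng.uniform(0.0, 1.0, size=M)


def regularized_nn_decode(y, codebook, reg):
    """Regularized nearest-neighbor decoding:
    pick argmin_m ||y - x_m||^2 / n - reg[m]."""
    dists = np.sum((codebook - y) ** 2, axis=1) / codebook.shape[1]
    return int(np.argmin(dists - reg))


# Transmit one codeword over an additive non-Gaussian (here Laplacian)
# noise channel and decode with the regularized rule.
m_true = 3
noise = rng.laplace(0.0, 0.5, size=n)
y = codebook[m_true] + noise
m_hat = regularized_nn_decode(y, codebook, reg)
print(m_hat)
```

With zero regularization the rule reduces to plain minimum-distance decoding; the regularizer biases the decoder toward codewords belonging to larger type classes, which is the qualitative effect the abstract's modified decoder achieves.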

[1] V. Tan et al. Refined Asymptotics for Rate-Distortion Using Gaussian Codebooks for Arbitrary Sources, 2017, IEEE Transactions on Information Theory.

[2] Vincent Y. F. Tan et al. The dispersion of nearest-neighbor decoding for additive non-Gaussian channels, 2016, 2016 IEEE International Symposium on Information Theory (ISIT).

[3] Mehul Motani et al. Second-Order and Moderate Deviations Asymptotics for Successive Refinement, 2016, IEEE Transactions on Information Theory.

[4] J. Nicholas Laneman et al. A Second-Order Achievable Rate Region for Gaussian Multi-Access Channels via a Central Limit Theorem for Functions, 2015, IEEE Transactions on Information Theory.

[5] Oliver Kosut et al. Third-order coding rate for universal compression of Markov sources, 2015, 2015 IEEE International Symposium on Information Theory (ISIT).

[6] Vincent Yan Fu Tan et al. The third-order term in the normal approximation for the AWGN channel, 2014, 2014 IEEE International Symposium on Information Theory.

[7] Vincent Yan Fu Tan et al. Asymmetric Evaluations of Erasure and Undetected Error Probabilities, 2014, IEEE Transactions on Information Theory.

[8] Vincent Y. F. Tan et al. Unequal Message Protection: Asymptotic and Non-Asymptotic Tradeoffs, 2014, IEEE Transactions on Information Theory.

[9] Aaron B. Wagner et al. Moderate Deviations in Channel Coding, 2012, IEEE Transactions on Information Theory.

[10] S. Verdú et al. Lossy joint source-channel coding in the finite blocklength regime, 2012, 2012 IEEE International Symposium on Information Theory Proceedings.

[11] Yuval Kochman et al. The dispersion of joint source-channel coding, 2011, 2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[12] Sergio Verdú et al. Fixed-Length Lossy Compression in the Finite Blocklength Regime, 2011, IEEE Transactions on Information Theory.

[13] S. Verdú et al. Channel dispersion and moderate deviations limits for memoryless channels, 2010, 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[14] S. Verdú et al. Channel Coding Rate in the Finite Blocklength Regime, 2010, IEEE Transactions on Information Theory.

[15] Amos Lapidoth et al. Nearest neighbor decoding for additive non-Gaussian noise channels, 1996, IEEE Trans. Inf. Theory.

[16] A. Lapidoth. On the role of mismatch in rate distortion theory, 1995, Proceedings of 1995 IEEE International Symposium on Information Theory.

[17] Robert G. Gallager. The random coding bound is tight for the average code (Corresp.), 1973, IEEE Trans. Inf. Theory.

[18] C. Shannon. Coding Theorems for a Discrete Source With a Fidelity Criterion, 2009.