Moderate Deviations in Channel Coding

We consider block codes whose rate converges to the channel capacity as the blocklength increases, and we examine how fast the probability of error can decay as a function of the speed of that convergence. For discrete memoryless channels, we prove that a moderate deviation principle holds for all convergence rates between the large deviation and the central limit theorem regimes.
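To make the regimes precise, the following is a minimal schematic statement of the moderate deviation principle in question, assuming a discrete memoryless channel with capacity $C$ and positive dispersion $V$; the symbols $\epsilon_n$, $V$, and $P_{\mathrm{e}}$ are standard notation in this line of work rather than notation taken from the text above.

\[
R_n \;=\; C - \epsilon_n, \qquad \epsilon_n \to 0, \qquad n\,\epsilon_n^{2} \to \infty,
\]
\[
\lim_{n \to \infty} \frac{1}{n\,\epsilon_n^{2}} \log P_{\mathrm{e}}(n, R_n) \;=\; -\frac{1}{2V}.
\]

Here the condition $n\,\epsilon_n^{2} \to \infty$ keeps the gap to capacity above the $1/\sqrt{n}$ central-limit scale, while $\epsilon_n \to 0$ keeps it below the fixed-rate (large deviation, error exponent) regime.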
