Large Deviations Behavior of the Logarithmic Error Probability of Random Codes

This work studies the deviations of the error exponent of the constant composition code ensemble around its expectation, known as the error exponent of the typical random code (TRC). In particular, it is shown that the probability of randomly drawing a codebook whose error exponent is smaller than the TRC exponent is exponentially small; upper and lower bounds for this exponent are given, and they coincide in some cases. In addition, the probability of randomly drawing a codebook whose error exponent is larger than the TRC exponent is shown to be double-exponentially small, and upper and lower bounds for the exponent of this double-exponential decay are given. The results suggest that codebooks whose error exponent exceeds that of the TRC are extremely rare. The key ingredient in the proofs is a new large deviations result for type class enumerators with dependent variables.
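
For concreteness, the two deviation regimes described above can be summarized schematically as follows. The symbols used here are illustrative placeholders and not necessarily the paper's notation: E(C_n) denotes the logarithmic error exponent of a randomly drawn constant composition codebook C_n of rate R, E_trc(R) the TRC exponent, and Lambda^-, Lambda^+ the two rate functions for which the paper derives upper and lower bounds.

% Schematic restatement of the abstract (requires amsmath + amssymb);
% all symbols below are assumed for illustration only.
\begin{align*}
  E(\mathcal{C}_n) &\triangleq -\tfrac{1}{n}\log P_e(\mathcal{C}_n),
  \qquad
  E_{\mathrm{trc}}(R) \triangleq \lim_{n\to\infty}
      \left\{-\tfrac{1}{n}\,\mathbb{E}\!\left[\log P_e(\mathcal{C}_n)\right]\right\},\\
  \Pr\!\left\{E(\mathcal{C}_n) \le E_{\mathrm{trc}}(R) - \epsilon\right\}
      &\doteq e^{-n\,\Lambda^{-}(R,\epsilon)}
      \quad \text{(exponentially small)},\\
  \Pr\!\left\{E(\mathcal{C}_n) \ge E_{\mathrm{trc}}(R) + \epsilon\right\}
      &\doteq e^{-e^{\,n\,\Lambda^{+}(R,\epsilon)}}
      \quad \text{(double-exponentially small)}.
\end{align*}

Here the dotted equality denotes equality to first order in the exponent, matching the exponential and double-exponential statements of the abstract.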
