Error Exponents of Typical Random Codes

We define the error exponent of the typical random code as the long-block limit of the negative normalized expectation of the logarithm of the error probability of the random code, as opposed to the traditional random coding error exponent, which is the limit of the negative normalized logarithm of the expectation of the error probability. For the ensemble of uniformly randomly drawn fixed composition codes, we provide exact error exponents of typical random codes for a general discrete memoryless channel (DMC) and a wide class of (stochastic) decoders, collectively referred to as the generalized likelihood decoder (GLD). This ensemble of fixed composition codes is shown to be no worse than any other ensemble of independent codewords drawn under a permutation-invariant distribution (e.g., i.i.d. codewords). We also present relationships between the error exponent of the typical random code and the ordinary random coding error exponent, as well as the expurgated exponent, for the GLD. Finally, we demonstrate that our analysis technique is also applicable to more general communication scenarios, such as list decoding (for fixed-size lists) and decoding with an erasure/list option in Forney's sense. All proofs appear in the full version of this paper: https://arxiv.org/pdf/1708.07301.pdf
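To make the contrast drawn in the first sentence concrete, the two exponents can be written in display form. This is a minimal sketch, where $\mathcal{C}_n$ denotes a randomly drawn code of block length $n$ at rate $R$ and $P_e(\mathcal{C}_n)$ its error probability; this notation is assumed here for illustration and is not fixed by the abstract itself:

\[
E_{\mathrm{trc}}(R) = \lim_{n\to\infty}\left\{-\frac{1}{n}\,\mathbb{E}\left[\ln P_e(\mathcal{C}_n)\right]\right\},
\qquad
E_{\mathrm{r}}(R) = \lim_{n\to\infty}\left\{-\frac{1}{n}\,\ln\mathbb{E}\left[P_e(\mathcal{C}_n)\right]\right\}.
\]

Since the logarithm is concave, Jensen's inequality gives $\mathbb{E}[\ln P_e(\mathcal{C}_n)] \le \ln\mathbb{E}[P_e(\mathcal{C}_n)]$, and hence $E_{\mathrm{trc}}(R) \ge E_{\mathrm{r}}(R)$: the exponent of the typical code is never worse than the ordinary random coding exponent. As for the GLD mentioned above, in the form used in Merhav's earlier work on the generalized stochastic likelihood decoder, it selects message $\hat{m}=m$, given the channel output $\boldsymbol{y}$, with probability

\[
\Pr\left[\hat{m}=m \,\middle|\, \boldsymbol{y}\right]
= \frac{\exp\{n\,g(\hat{P}_{\boldsymbol{x}_m \boldsymbol{y}})\}}
       {\sum_{m'}\exp\{n\,g(\hat{P}_{\boldsymbol{x}_{m'} \boldsymbol{y}})\}},
\]

where $\hat{P}_{\boldsymbol{x}_m \boldsymbol{y}}$ is the joint empirical distribution of codeword $\boldsymbol{x}_m$ and $\boldsymbol{y}$, and $g$ is a given real-valued function; different choices of $g$ recover matched, mismatched, and (in a suitable limit) deterministic maximum-likelihood decoding.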
