Error Exponents of Erasure/List Decoding Revisited Via Moments of Distance Enumerators

The analysis of random coding error exponents pertaining to erasure/list decoding, due to Forney, is revisited. Instead of using Jensen's inequality as well as some other inequalities in the derivation, we demonstrate that an exponentially tight analysis can be carried out by assessing the relevant moments of certain distance enumerators. The resulting bound has the following advantages: (i) it is at least as tight as Forney's bound, (ii) under certain symmetry conditions associated with the channel and the random coding distribution, it is simpler than Forney's bound in the sense that it involves an optimization over one parameter only (rather than two), and (iii) in certain special cases, like the binary symmetric channel (BSC), the optimum value of this parameter can be found in closed form, so there is no need to conduct a numerical search. We have not yet found a numerical example where this new bound is strictly better than Forney's bound, which may provide additional evidence in support of Forney's conjecture that his bound is tight for the average code. However, when applying the proposed analysis technique to a certain universal decoder with erasures, we demonstrate that it may yield significantly tighter exponential error bounds. We believe that this technique can be useful in simplifying and improving exponential error bounds in other problem settings as well.
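For context, the baseline being revisited is Forney's classical random coding bound for erasure decoding, which (in its standard form) reads E1(R,T) >= max over 0 <= s <= rho <= 1 of [E0(s,rho) - rho*R - s*T], where E0(s,rho) = -ln sum_y [sum_x q(x) P(y|x)^(1-s)] * [sum_x q(x) P(y|x)^(s/rho)]^rho. The sketch below is not taken from the paper; it is a minimal illustration, under the assumption of this standard form, of the two-parameter optimization for a BSC that the proposed bound reduces to a single parameter (with a closed-form optimizer in the BSC case). The crossover probability, rate, threshold, and brute-force grid are illustrative choices.

```python
# Minimal numerical sketch (illustrative, not the paper's method):
# Forney's classical lower bound
#   E1(R, T) >= max_{0 <= s <= rho <= 1} [ E0(s, rho) - rho*R - s*T ]
# evaluated for a BSC by brute-force grid search over the two parameters.
import numpy as np

def E0(s, rho, p, q=(0.5, 0.5)):
    """Gallager/Forney function E0(s, rho) in nats for a BSC(p) with
    input distribution q (uniform by default)."""
    P = np.array([[1.0 - p, p],      # channel transition matrix P(y|x)
                  [p, 1.0 - p]])
    q = np.asarray(q)
    total = 0.0
    for y in range(2):
        a = np.sum(q * P[:, y] ** (1.0 - s))          # sum_x q(x) P(y|x)^(1-s)
        b = np.sum(q * P[:, y] ** (s / rho)) ** rho    # [sum_x q(x) P(y|x)^(s/rho)]^rho
        total += a * b
    return -np.log(total)

def forney_E1(R, T, p, grid=200):
    """Brute-force maximization of E0(s, rho) - rho*R - s*T
    over 0 <= s <= rho <= 1 (exponents are nonnegative, so start at 0)."""
    best = 0.0
    for rho in np.linspace(1e-3, 1.0, grid):
        for s in np.linspace(1e-3, rho, grid):
            best = max(best, E0(s, rho, p) - rho * R - s * T)
    return best

if __name__ == "__main__":
    # Example: BSC with crossover 0.1, rate 0.2 nats/channel use, threshold T = 0.05
    print(forney_E1(R=0.2, T=0.05, p=0.1))
```

The point of the comparison in the abstract is precisely this inner two-dimensional search: under the stated symmetry conditions, the distance-enumerator approach yields a bound at least as tight while requiring a search over a single parameter, and for the BSC no search at all.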
