The likelihood decoder: Error exponents and mismatch

This paper studies likelihood decoding for channel coding over discrete memoryless channels. It is shown that the likelihood decoder achieves the same random-coding error exponents as the maximum-likelihood decoder for i.i.d. and constant-composition random codes. The role of mismatch in likelihood decoding is then studied, and the notion of the mismatched likelihood decoder capacity is introduced. It is shown, for both random coding and optimized codebooks, that the mismatched likelihood decoder can yield strictly worse achievable rates and error exponents than the corresponding mismatched maximum-metric decoder.
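For concreteness, the likelihood decoder is a stochastic rule: upon observing the channel output, it selects each message with probability proportional to the likelihood of the corresponding codeword, rather than taking the argmax as maximum-likelihood decoding does. The following is a minimal sketch of the matched and mismatched decision rules, with notation assumed here for illustration only (block length $n$, messages $1,\dots,M$, codewords $\boldsymbol{x}_m$, channel $W$, decoding metric $q$):

\[
\Pr[\hat{m} = m \mid \boldsymbol{y}] = \frac{W^n(\boldsymbol{y} \mid \boldsymbol{x}_m)}{\sum_{m'=1}^{M} W^n(\boldsymbol{y} \mid \boldsymbol{x}_{m'})},
\qquad
\Pr[\hat{m} = m \mid \boldsymbol{y}] = \frac{q^n(\boldsymbol{x}_m, \boldsymbol{y})}{\sum_{m'=1}^{M} q^n(\boldsymbol{x}_{m'}, \boldsymbol{y})},
\]

where the left-hand rule is the matched likelihood decoder and the right-hand rule its mismatched counterpart; the maximum-likelihood and maximum-metric decoders instead output $\arg\max_m$ of the respective numerators.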
