The Error Exponent of Random Gilbert-Varshamov Codes

We consider transmission over a discrete memoryless channel (DMC) W(y|x) with finite input and output alphabets X and Y. It is assumed that an (n, M_n)-codebook M_n = {x_1, …, x_{M_n}} with rate R_n = (1/n) log M_n is used for transmission. The type-dependent maximum-metric decoder estimates the transmitted message as

  m̂ = arg max_{x_i ∈ M_n} q(P̂_{x_i, y}),   (1)

where P̂_{x, y} is the joint empirical distribution [1, Ch. 2] of the pair (x, y) and the metric q : P(X × Y) → R is continuous. Maximum-likelihood (ML) decoding is a special case of (1), but the decoder may in general be mismatched [2], [3].
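The decoding rule (1) can be made concrete with a short sketch: compute the joint empirical distribution (joint type) of each codeword with the received sequence, score it with the metric q, and pick the maximizer. The function names and the small binary-symmetric-channel example below are illustrative assumptions, not part of the paper; the ML metric q(P) = Σ_{a,b} P(a,b) log W(b|a) recovers maximum-likelihood decoding as the special case noted above.

```python
import numpy as np

def joint_empirical(x, y, X_size, Y_size):
    """Joint type P_hat_{x,y}: fraction of positions t with (x_t, y_t) = (a, b)."""
    P = np.zeros((X_size, Y_size))
    for a, b in zip(x, y):
        P[a, b] += 1.0
    return P / len(x)

def metric_decode(codebook, y, q, X_size, Y_size):
    """Type-dependent maximum-metric decoder: argmax_i q(P_hat_{x_i, y})."""
    scores = [q(joint_empirical(x, y, X_size, Y_size)) for x in codebook]
    return int(np.argmax(scores))

def ml_metric(W):
    """ML decoding as a special case: q(P) = sum_{a,b} P(a,b) log W(b|a).
    Entries with P(a,b) = 0 contribute nothing, so they are masked out."""
    return lambda P: float(np.sum(P * np.log(np.where(P > 0, W, 1.0))))

# Hypothetical example: binary symmetric channel with crossover 0.1,
# two length-4 codewords, received sequence y.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
codebook = [[0, 0, 0, 0], [1, 1, 1, 1]]
m_hat = metric_decode(codebook, [1, 1, 0, 1], ml_metric(W), 2, 2)
```

A mismatched decoder is obtained simply by scoring with a metric q that is not matched to the true channel W, e.g. the ML metric of a different channel estimate.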

[1] E. N. Gilbert, "A comparison of signalling alphabets," Bell Syst. Tech. J., 1952.

[2] I. Csiszár and P. Narayan, "Channel capacity for a given decoding metric," IEEE Trans. Inf. Theory, 1995.

[3] G. Kaplan and S. Shamai, "On information rates for mismatched decoders," in Proc. IEEE Int. Symp. Inf. Theory, 1993.

[4] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed., 2011.

[5] I. Csiszár and J. Körner, "Graph decomposition: A new key to coding theorems," IEEE Trans. Inf. Theory, 1981.

[6] Y. Polyanskiy, H. V. Poor, and S. Verdú, "Channel coding rate in the finite blocklength regime," IEEE Trans. Inf. Theory, 2010.