The Error Exponent of Generalized Random Gilbert-Varshamov Codes

We introduce a random code construction for channel coding in which the codewords are constrained to be well-separated according to a given distance function, in analogy with an existing construction that attains the Gilbert-Varshamov bound. We derive an achievable error exponent for this construction and prove its tightness with respect to the ensemble average. We show that, by choosing the distance function to be the negative of the empirical mutual information, the exponent recovers the Csiszár-Körner exponent as a special case. We further establish that this choice of distance function is optimal with respect to the exponent of the random coding scheme.
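
To make the construction concrete, the following is a minimal sketch, assuming a simple rejection-based generation rule: candidate codewords are drawn i.i.d. from a fixed input distribution and kept only if they are at distance at least a given threshold from every previously kept codeword. All names, parameters, and the acceptance rule below are illustrative assumptions, not the paper's exact procedure; the distance function shown is the negative empirical mutual information discussed in the abstract.

```python
import numpy as np


def neg_empirical_mutual_information(x, y):
    """Distance d(x, y) = -I_hat(x; y), where I_hat is the mutual information
    (in nats) of the joint empirical distribution (type) of the pair (x, y)."""
    n = len(x)
    joint = {}
    for a, b in zip(x, y):
        joint[(a, b)] = joint.get((a, b), 0.0) + 1.0 / n
    px, py = {}, {}
    for (a, b), p in joint.items():
        px[a] = px.get(a, 0.0) + p
        py[b] = py.get(b, 0.0) + p
    mi = sum(p * np.log(p / (px[a] * py[b])) for (a, b), p in joint.items())
    return -mi


def generate_grgv_codebook(num_codewords, block_length, alphabet, pmf,
                           distance, threshold, rng=None, max_trials=100_000):
    """Hypothetical sketch of a well-separated random codebook: draw candidates
    i.i.d. from `pmf` and accept a candidate only if its distance to every
    previously accepted codeword is at least `threshold`."""
    rng = np.random.default_rng() if rng is None else rng
    codebook = []
    trials = 0
    while len(codebook) < num_codewords and trials < max_trials:
        trials += 1
        candidate = rng.choice(alphabet, size=block_length, p=pmf)
        if all(distance(candidate, c) >= threshold for c in codebook):
            codebook.append(candidate)
    return codebook


if __name__ == "__main__":
    # With d = -I_hat, threshold = -0.05 keeps pairs whose empirical mutual
    # information is at most 0.05 nats (the values here are arbitrary).
    codebook = generate_grgv_codebook(
        num_codewords=8, block_length=64, alphabet=[0, 1], pmf=[0.5, 0.5],
        distance=neg_empirical_mutual_information, threshold=-0.05)
    print(len(codebook), "codewords kept")
```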
