On information rates for mismatched decoders

Reliable transmission over a discrete-time memoryless channel with a decoding metric that is not necessarily matched to the channel (mismatched decoding) is considered. It is assumed that the encoder knows both the true channel and the decoding metric. The lower bound on the highest achievable rate found by Csiszár and Körner (1981) and by Hui (1983) for DMCs, hereafter denoted C_LM, is shown to bear some interesting information-theoretic meanings. The bound C_LM turns out to be the highest achievable rate in the random coding sense, namely, the random coding capacity for mismatched decoding. It is also demonstrated that the ε-capacity associated with mismatched decoding cannot exceed C_LM. New bounds and some properties of C_LM are established and used to relate it to the generalized mutual information and to the generalized cutoff rate. The expression for C_LM is extended to a certain class of memoryless channels with continuous input and output alphabets, and is used to calculate C_LM explicitly for several examples of theoretical and practical interest. Finally, it is demonstrated that, in contrast to the classical matched decoding case, under the mismatched decoding regime the highest achievable rate depends on whether the performance criterion is the bit error rate or the message error probability, and on whether the coding strategy is deterministic or randomized.
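For orientation, the Csiszár–Körner/Hui lower bound C_LM referred to above is commonly stated as follows (this is the standard textbook form of the LM rate, not a formula quoted from this paper; W denotes the true channel, q the decoding metric, and P an input distribution):

```latex
C_{\mathrm{LM}} \;=\; \max_{P} \;\min_{V \in \mathcal{V}(P,W,q)} I(P, V),
```

where the minimization is over all auxiliary channels V whose output marginal under P matches that of W,

```latex
\sum_{x} P(x) V(y \mid x) \;=\; \sum_{x} P(x) W(y \mid x) \quad \forall y,
```

and which do at least as well as W on the expected metric,

```latex
\sum_{x,y} P(x) V(y \mid x) \log q(x,y) \;\ge\; \sum_{x,y} P(x) W(y \mid x) \log q(x,y).
```

When the metric is matched, i.e. q(x, y) = W(y | x), the constraint set collapses so that the minimum is attained at V = W and C_LM reduces to the ordinary channel capacity max_P I(P, W).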

[1] L. H. Brandenburg et al., "Capacity of the Gaussian channel with memory: The multivariate case," 1974.

[2] J. Ziv et al., "Universal decoding for finite-state channels," IEEE Trans. Inf. Theory, 1985.

[3] J. Omura et al., "Coded error probability evaluation for antijam communication systems," IEEE Trans. Commun., 1982.

[4] J. K. Omura et al., "Decoding with approximate channel statistics for bandlimited nonlinear satellite channels," IEEE Trans. Inf. Theory, 1981.

[5] E. Zehavi et al., "Decoding under integer metrics constraints," IEEE Trans. Commun., 1995.

[6] J. Y. N. Hui et al., "Fundamental issues of multiple accessing," 1983.

[7] I. Csiszár et al., "Arbitrarily varying channels with general alphabets and states," IEEE Trans. Inf. Theory, 1992.

[8] C. Shannon, "Probability of error for optimal codes in a Gaussian channel," 1959.

[9] E. L. Lehmann et al., Theory of Point Estimation, 1950.

[10] I. Bar-David et al., "Design criteria for noncoherent Gaussian channels with MFSK signaling and coding," IEEE Trans. Commun., 1976.

[11] I. Csiszár et al., "Graph decomposition: A new key to coding theorems," IEEE Trans. Inf. Theory, 1981.

[12] S. Shamai et al., "Performance bounds and cutoff rates of quantum limited OOK with optical amplification," Proc. 1994 IEEE Int. Symp. Information Theory, 1994.

[13] G. Kaplan et al., "On information rates for mismatched decoders," Proc. IEEE Int. Symp. Information Theory, 1993.

[14] V. B. Balakirsky, "Coding theorem for discrete memoryless channels with given decision rule," Algebraic Coding, 1991.

[15] J. M. Cioffi et al., "Vector coding for partial response channels," IEEE Trans. Inf. Theory, 1990.

[16] I. Csiszár, "Generalized cutoff rates and Rényi's information measures," IEEE Trans. Inf. Theory, 1995.

[17] R. Ahlswede et al., "Erasure, list, and detection zero-error capacities for low noise and a relation to identification," IEEE Trans. Inf. Theory, 1996.

[18] J. Wolfowitz, Coding Theorems of Information Theory, 1962.

[19] B. L. Hughes et al., "Exponential error bounds for random codes on Gaussian arbitrarily varying channels," IEEE Trans. Inf. Theory, 1991.

[20] R. E. Blahut et al., Principles and Practice of Information Theory, 1987.

[21] T. M. Cover et al., Elements of Information Theory, 2005.

[22] I. Csiszár et al., "Channel capacity for a given decoding metric," IEEE Trans. Inf. Theory, 1995.

[23] E. Geraniotis et al., "Minimax robust coding for channels with uncertainty statistics," IEEE Trans. Inf. Theory, 1985.

[24] G. Kaplan, "On information rates of compound channels (with application to antipodal signaling in a fading environment)," 1980.

[25] A. D. Wyner, "A bound on the number of distinguishable functions which are time-limited and approximately band-limited," 1973.

[26] S. P. Boyd et al., "On optimal signal sets for digital communications with finite precision and amplitude constraints," IEEE Trans. Commun., 1991.

[27] A. Lapidoth et al., "Mismatched decoding and the multiple-access channel," IEEE Trans. Inf. Theory, 1994.

[28] T. M. Fischer, "Some remarks on the role of inaccuracy in Shannon's theory of information transmission," 1978.

[29] R. Gallager, Information Theory and Reliable Communication, 1968.

[30] M. E. Arutyunyan, "Bounds on E-capacity of a channel with random parameter," 1991.

[31] W. L. Root et al., "Estimates of Epsilon capacity for certain linear communication channels," IEEE Trans. Inf. Theory, 1968.

[32] I. G. Stiglitz et al., "A coding theorem for a class of unknown channels," IEEE Trans. Inf. Theory, 1967.