Error exponents of modulo-additive noise channels with side information at the transmitter

We consider the optimum strategy for using channel-state ("side") information in transmission over a modulo-additive noise channel with state-dependent noise, where the receiver has no access to the side information (SI). Previous work showed that, capacity-wise, the optimum transmitter shifts each code letter by a "prediction" of the noise sample based on the SI. We show that this structure also achieves the random-coding error exponent, and is therefore optimal over a range of rates below capacity. Specifically, the optimum transmitter predictor minimizes the Rényi entropy of the prediction error; the Rényi order depends on the rate, and tends to one (corresponding to the Shannon entropy) as the rate approaches capacity. In contrast, we show that this "prediction strategy" may not be optimal at low transmission rates.
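
To make the role of the Rényi order concrete, the following is a minimal sketch, assuming the standard Gallager random-coding exponent for a modulo-q additive noise channel with a uniform input distribution; the notation (q, noise Z, state S, shift function f) is assumed here and not taken verbatim from the paper:

    E_r(R) = \max_{0 \le \rho \le 1} \; \rho \Bigl[ \log q \;-\; H_{1/(1+\rho)}(Z) \;-\; R \Bigr],
    \qquad
    H_\alpha(Z) = \frac{1}{1-\alpha} \, \log \sum_{z} P_Z(z)^{\alpha}.

In this form, applying a transmitter shift f(S) replaces the noise Z by the prediction error (Z - f(S)) mod q, so the exponent is largest when f minimizes the order-1/(1+\rho) Rényi entropy of that error; since the maximizing \rho tends to 0 as R approaches capacity, the order 1/(1+\rho) tends to 1 and the criterion reduces to the Shannon entropy of the prediction error.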
