We consider the optimum strategy for using channel state ("side") information in transmission over a modulo-additive noise channel, where the receiver does not have access to the side information. Previous work showed that, capacity-wise, the optimum transmitter shifts each code letter by a "prediction" of the noise sample based on the side information. We show that this structure also achieves the random-coding error exponent, and is therefore optimum over a range of rates below capacity. Specifically, the optimum transmitter-predictor minimizes the Rényi entropy of the prediction error; the Rényi order depends on the rate, and goes to one (corresponding to Shannon (1958) entropy) for rates close to capacity. We also consider the problem of coding with side information at the transmitter subject to a power constraint.
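The prediction-shift structure described above can be illustrated with a minimal sketch. Here the alphabet size `M`, the toy side-information model, and the predictor are all hypothetical choices for illustration; the point is only that pre-subtracting a noise prediction (mod M) leaves the receiver with the code letter corrupted by the prediction error alone.

```python
import random

M = 8  # alphabet size of the modulo-additive channel (illustrative choice)

def transmit(code_letter, noise_prediction):
    # Shift the code letter by the predicted noise sample (mod M),
    # pre-cancelling the predictable part of the noise.
    return (code_letter - noise_prediction) % M

def channel(x, noise):
    # Modulo-additive noise channel: Y = X + Z (mod M).
    return (x + noise) % M

random.seed(0)
s = random.randrange(M)               # side information seen by the transmitter
z = (s + random.choice([0, 1])) % M   # noise correlated with the side information
z_hat = s                             # toy predictor of the noise from s

c = 5                                 # code letter
y = channel(transmit(c, z_hat), z)

# The receiver observes the code letter plus the prediction error (mod M),
# so only the residual uncertainty of (Z - Z_hat) limits performance.
assert y == (c + (z - z_hat)) % M
```

In this sketch the effective channel from `c` to `y` is modulo-additive with noise `(z - z_hat) mod M`, which is why the quality of the predictor, measured here by an entropy of the prediction error, governs the achievable rates.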