On the capacity of some channels with channel state information

We study the capacity of some channels whose conditional output probability distribution depends on a state process independent of the channel input, and for which channel state information (CSI) signals are available both at the transmitter (CSIT) and at the receiver (CSIR). When the channel state and the CSI signals are jointly independent and identically distributed (i.i.d.), the channel reduces to a case studied by Shannon (1958). In this case, we show that when the CSIT is a deterministic function of the CSIR, optimal coding is particularly simple. When the state process has memory, we provide a general capacity formula and give more restrictive conditions under which the capacity still admits a simple single-letter characterization, again allowing simple optimal coding. Finally, we turn to the additive white Gaussian noise (AWGN) channel with fading and generalize some known results on the capacity of this channel with CSI. In particular, we show that variable-rate coding (or multiplexing of several codebooks) is not needed to achieve capacity and that, even when the CSIT is not perfect, the capacity-achieving power allocation is of the waterfilling type.
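
As a point of reference, here is a minimal sketch of the perfect-CSI special case from Goldsmith and Varaiya [3], which the waterfilling statement above generalizes; the notation (fading power gain \gamma, average power constraint \bar{P}, water level 1/\lambda, unit-variance noise) is ours and not taken from the paper:

\[
C \;=\; \mathbb{E}_{\gamma}\!\left[\log\bigl(1+\gamma\,P(\gamma)\bigr)\right],
\qquad
P(\gamma) \;=\; \left[\frac{1}{\lambda}-\frac{1}{\gamma}\right]^{+},
\]

where $[x]^{+} = \max(x,0)$ and $\lambda$ is chosen so that $\mathbb{E}_{\gamma}[P(\gamma)] = \bar{P}$. The claim in the abstract is that a power allocation of this waterfilling form remains capacity-achieving even when the CSIT is only an imperfect observation of the fading state.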

[1]  Shlomo Shamai et al., Error Exponents and Outage Probabilities for the Block-Fading Gaussian Channel, 1991, IEEE International Symposium on Personal, Indoor and Mobile Radio Communications.

[2]  David Tse et al., Multiaccess Fading Channels-Part I: Polymatroid Structure, Optimal Resource Allocation and Throughput Capacities, 1998, IEEE Trans. Inf. Theory.

[3]  Pravin Varaiya et al., Capacity of fading channels with channel side information, 1997, IEEE Trans. Inf. Theory.

[4]  Robert G. Gallager, Information Theory and Reliable Communication, 1968.

[5]  Jacob Wolfowitz, Coding Theorems of Information Theory, 1962, Ergebnisse der Mathematik und ihrer Grenzgebiete.

[7]  Sergio Verdú et al., A general formula for channel capacity, 1994, IEEE Trans. Inf. Theory.

[8]  Claude E. Shannon, Channels with Side Information at the Transmitter, 1958, IBM J. Res. Dev.

[9]  A. Goldsmith et al., Capacity of time-varying channels with channel side information, 1997, Proceedings of IEEE International Symposium on Information Theory.

[10]  Pravin Varaiya et al., Capacity, mutual information, and coding for finite-state Markov channels, 1996, IEEE Trans. Inf. Theory.

[11]  James S. Harris et al., Tables of integrals, 1998.

[12]  Thomas H. E. Ericson et al., A Gaussian channel with slow fading (Corresp.), 1970, IEEE Trans. Inf. Theory.

[13]  Frederick Jelinek et al., Indecomposable Channels with Side Information at the Transmitter, 1965, Inf. Control.

[14]  S. Shamai et al., The capacity of discrete-time Rayleigh fading channels, 1997, Proceedings of IEEE International Symposium on Information Theory.

[15]  I. S. Gradshteyn and I. M. Ryzhik, Table of Integrals, Series, and Products, 1966.

[16]  Israel Bar-David et al., Capacity and coding for the Gilbert-Elliot channels, 1989, IEEE Trans. Inf. Theory.

[17]  Masoud Salehi et al., Optimal quantization for finite-state channels, 1997, IEEE Trans. Inf. Theory.

[18]  Jacob Wolfowitz et al., Channels with Arbitrarily Varying Channel Probability Functions, 1962, Inf. Control.

[19]  Rudolf Ahlswede, Arbitrarily varying channels with states sequence known to the sender, 1986, IEEE Trans. Inf. Theory.

[20]  E. Gilbert, Capacity of a burst-noise channel, 1960.

[21]  Uri Erez et al., Noise prediction for channel coding with side information at the transmitter, 1998, Proceedings of 1998 IEEE International Symposium on Information Theory.

[22]  Andrea J. Goldsmith et al., Variable-rate variable-power MQAM for fading channels, 1997, IEEE Trans. Commun.

[23]  David Tse et al., Multiaccess Fading Channels-Part II: Delay-Limited Capacities, 1998, IEEE Trans. Inf. Theory.

[24]  Daniel Cygan et al., The land mobile satellite communication channel-recording, statistics, and channel model, 1991.

[25]  Toby Berger et al., The quadratic Gaussian CEO problem, 1997, IEEE Trans. Inf. Theory.

[26]  M. Schwartz et al., Communication Systems and Techniques, 1996, IEEE Communications Magazine.

[27]  Raymond Knopp et al., Information capacity and power control in single-cell multiuser communications, 1995, Proceedings of IEEE International Conference on Communications ICC '95.

[28]  M. Salehi, Capacity and coding for memories with real-time noisy defect information at encoder and decoder, 1992.

[29]  S. Shamai et al., Error probabilities for the block-fading Gaussian channel, 1995.

[30]  J. Larrea-Arrieta et al., Adaptive forward error control schemes in channels with side information at the transmitter, 1995, Proceedings of 1995 IEEE International Symposium on Information Theory.

[31]  Wayne E. Stark et al., Channels with block interference, 1984, IEEE Trans. Inf. Theory.

[32]  Martin Vetterli et al., Source coding and transmission of signals over time-varying channels with side information, 1995, Proceedings of 1995 IEEE International Symposium on Information Theory.

[33]  Abbas El Gamal et al., On the capacity of computer memory with defects, 1983, IEEE Trans. Inf. Theory.

[34]  Giuseppe Caire et al., Optimum power control over fading channels, 1999, IEEE Trans. Inf. Theory.

[35]  Frederick Jelinek et al., Determination of Capacity Achieving Input Probabilities for a Class of Finite State Channels with Side Information, 1966, Inf. Control.

[36]  Uri Erez et al., Capacity and Coding for Symmetric Channels with Side Information at the Transmitter, 1997.

[37]  Pravin Varaiya et al., Increasing spectral efficiency through power control, 1993, Proceedings of ICC '93 - IEEE International Conference on Communications.

[39]  Shlomo Shamai et al., Information theoretic considerations for cellular mobile radio, 1994.