Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels

A technique is proposed for the derivation of upper bounds on channel capacity. It is based on a dual expression for channel capacity in which the maximization (of mutual information) over distributions on the channel input alphabet is replaced with a minimization (of average relative entropy) over distributions on the channel output alphabet. We also propose a technique for the analysis of the asymptotic capacity of cost-constrained channels. This technique is based on the observation that, under fairly mild conditions, capacity-achieving input distributions "escape to infinity." The above techniques are applied to multiple-antenna flat-fading channels with memory, where the realization of the fading process is unknown at the transmitter and unknown (or only partially known) at the receiver. It is demonstrated that, at high signal-to-noise ratio (SNR), the capacity of such channels typically grows only double-logarithmically in the SNR. To better understand this phenomenon and the rates at which it sets in, we introduce the fading number as the second-order term in the high-SNR asymptotic expansion of capacity, and derive estimates of its value for various systems. It is suggested that at rates significantly higher than the fading number, communication becomes extremely power inefficient, thus posing a limit on practically achievable rates. Upper and lower bounds on the fading number are also presented. For single-input single-output (SISO) systems the bounds coincide, thus yielding a complete characterization of the fading number for general stationary and ergodic fading processes. We also demonstrate that for memoryless multiple-input single-output (MISO) channels, the fading number is achievable using beam-forming, and we derive an expression for the optimal beam direction. This direction depends on the fading law and is, in general, not the direction that maximizes the SNR on the induced SISO channel. Using a new closed-form expression for the expectation of the logarithm of a noncentral chi-square distributed random variable, we provide closed-form expressions for the fading number of various systems with Gaussian fading, including SISO systems with circularly symmetric stationary and ergodic Gaussian fading. The fading number of the latter is determined by the fading mean, the fading variance, and the mean squared error in predicting the present fading from its past; it is not directly related to the Doppler spread. For the Rayleigh, Ricean, and multiple-antenna Rayleigh-fading channels we also present firm (nonasymptotic) upper and lower bounds on channel capacity. These bounds are asymptotically tight in the sense that their difference from capacity approaches zero at high SNR and their ratio to capacity approaches one at low SNR.
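A minimal sketch of the two central objects, in notation of our own choosing (the paper's exact normalizations may differ), may help. For a channel law $W(\cdot \mid x)$ and any distribution $R$ on the output alphabet, the identity $I(Q;W) = \mathbb{E}_{Q}\bigl[D\bigl(W(\cdot\mid X)\,\|\,R\bigr)\bigr] - D\bigl((QW)\,\|\,R\bigr)$ shows that every output distribution yields an upper bound on capacity:

\[
C \;=\; \sup_{Q} I(Q;W)
\;\le\; \sup_{Q} \, \mathbb{E}_{Q}\!\left[ D\bigl( W(\cdot\mid X) \,\big\|\, R \bigr) \right]
\;\le\; \sup_{x} \, D\bigl( W(\cdot\mid x) \,\big\|\, R \bigr),
\]

so a judicious choice of $R$ can make the bound tight. The fading number is then the constant term in the double-logarithmic expansion (defined via a limit superior in general):

\[
\chi \;\triangleq\; \limsup_{\mathrm{SNR}\to\infty}
\Bigl[\, C(\mathrm{SNR}) \;-\; \log\log \mathrm{SNR} \,\Bigr],
\qquad\text{so that}\qquad
C(\mathrm{SNR}) \;=\; \log\log \mathrm{SNR} \;+\; \chi \;+\; o(1)
\]

whenever the limit exists.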
