The capacity of average and peak-power-limited quadrature Gaussian channels

The capacity C(ρ_a, ρ_p) of the discrete-time quadrature additive Gaussian channel (QAGC) with inputs subjected to (normalized) average and peak power constraints, ρ_a and ρ_p respectively, is considered. By generalizing Smith's results for the scalar average- and peak-power-constrained Gaussian channel, it is shown that the capacity-achieving distribution is discrete in amplitude (envelope), with a finite number of mass points, and has an independent, uniformly distributed phase; geometrically, it is described by a finite set of concentric circles. It is shown that when peak power is the sole effective constraint, a constant-envelope input with uniformly distributed phase achieves capacity for ρ_p ≤ 7.8 dB (4.8 dB per dimension). The capacity under a peak-power constraint is evaluated for a wide range of ρ_p by incorporating the theoretical observations into a nonlinear dynamic-programming procedure. Closed-form expressions are developed for the asymptotic (low and large ρ_a and ρ_p) capacity, for the corresponding capacity-achieving distributions, and for lower and upper bounds on the capacity C(ρ_a, ρ_p). The capacity C(ρ_a, ρ_p) provides an improved ultimate upper bound on the reliable information rates of any communication system operating over the QAGC under both average- and peak-power limitations, compared with the classical Shannon formula for the capacity of the QAGC, which does not account for the peak-power constraint. This is particularly important for systems that operate with a restrictive (close to 1) average-to-peak power ratio ρ_a/ρ_p and at moderate power values.
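
As a quick numerical illustration of the constant-envelope regime described above, the sketch below (Python; the helper name constant_envelope_rate and the normalization N0 = 1 are assumptions of this example, not notation from the paper) evaluates the mutual information of a constant-envelope, uniform-phase input over the QAGC by numerical integration and compares it with the classical average-power-only Shannon formula log2(1 + ρ). It is not the paper's nonlinear dynamic-programming procedure; it only evaluates this one candidate input distribution.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import i0e  # exponentially scaled modified Bessel I0 (avoids overflow)


def constant_envelope_rate(rho_p):
    """Mutual information (bits per complex symbol) achieved over the QAGC
    Y = X + N by a constant-envelope input X = A*exp(j*Theta), Theta uniform,
    where rho_p = A^2 / N0 is the normalized peak power per complex symbol
    and the complex noise variance is normalized to N0 = 1 (assumption)."""
    N0 = 1.0
    A = np.sqrt(rho_p * N0)

    # Output density depends only on r = |y|:
    #   p(r) = exp(-(r^2 + A^2)/N0) * I0(2*A*r/N0) / (pi*N0)
    # written with i0e(z) = exp(-z)*I0(z) for numerical stability.
    def p(r):
        return np.exp(-(r - A) ** 2 / N0) * i0e(2.0 * A * r / N0) / (np.pi * N0)

    # h(Y) = -int_0^inf 2*pi*r * p(r) * ln p(r) dr   (in nats)
    def integrand(r):
        pr = p(r)
        return -2.0 * np.pi * r * pr * np.log(pr) if pr > 0.0 else 0.0

    h_y, _ = quad(integrand, 0.0, A + 12.0 * np.sqrt(N0))
    h_y_given_x = np.log(np.pi * np.e * N0)   # differential entropy of CN(0, N0)
    return (h_y - h_y_given_x) / np.log(2.0)  # nats -> bits


if __name__ == "__main__":
    for rho_db in (0.0, 4.8, 7.8, 15.0):
        rho = 10.0 ** (rho_db / 10.0)
        shannon = np.log2(1.0 + rho)  # average-power-only capacity formula
        print(f"rho_p = {rho_db:4.1f} dB: constant envelope {constant_envelope_rate(rho):.3f} b/sym, "
              f"log2(1+rho) = {shannon:.3f} b/sym")
```

For a constant-envelope input the average and peak powers coincide (ρ_a = ρ_p), so the printed gap to log2(1 + ρ) indicates how close such a peak-limited signal comes to the unconstrained Gaussian-input capacity at each SNR; by the result quoted above, below roughly 7.8 dB this single-circle input is itself peak-power-capacity-achieving.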

[1]  I. Bar-David, On the Capacity of Peak Power Constrained Gaussian Channels, 1988.

[2]  G. Einarsson, Signal Design for the Amplitude-Limited Gaussian Channel by Error Bound Optimization, 1979, IEEE Trans. Commun.

[3]  Richard D. Gitlin, et al., Optimization of Two-Dimensional Signal Constellations in the Presence of Gaussian Noise, 1974, IEEE Trans. Commun.

[4]  Gordon Raisbeck, et al., Transmission of photographic data by electrical transmission (Corresp.), 1960, IRE Trans. Inf. Theory.

[5]  Joel G. Smith, The Information Capacity of Amplitude- and Variance-Constrained Scalar Gaussian Channels, 1971, Inf. Control.

[6]  Shlomo Shamai, et al., Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs, 1991, IEEE Trans. Inf. Theory.

[7]  Shlomo Shamai, et al., On the capacity of a Gaussian channel with peak power and band-limited input signals, 1988.

[8]  N. Blachman, A Comparison of the Informational Capacities of Amplitude- and Phase-Modulation Communication Systems, 1953, Proceedings of the IRE.

[9]  Richard E. Blahut, Principles and practice of information theory, 1987.

[10]  G. David Forney, Coset codes-I: Introduction and geometrical classification, 1988, IEEE Trans. Inf. Theory.

[11]  Hans S. Witsenhausen, Some aspects of convexity useful in information theory, 1980, IEEE Trans. Inf. Theory.

[12]  Adel A. M. Saleh, et al., On the Computational Cutoff Rate, R0, for the Peak-Power-Limited Gaussian Channel, 1987, IEEE Trans. Commun.

[13]  G. David Forney, et al., Efficient Modulation for Band-Limited Channels, 1984, IEEE J. Sel. Areas Commun.

[14]  G. David Forney, et al., Advanced Modulation Techniques for V.Fast, 1993, Eur. Trans. Telecommun.

[15]  Claude E. Shannon, A Mathematical Theory of Communication, 1948.

[16]  C. Thomas, et al., Digital Amplitude-Phase Keying with M-Ary Alphabets, 1974, IEEE Trans. Commun.

[17]  Jerzy Seidler, Bounds on the mean-square error and the quality of domain decisions based on mutual information, 1971, IEEE Trans. Inf. Theory.

[18]  M. Schwartz, et al., Communication Systems and Techniques, 1996, IEEE Communications Magazine.

[19]  G. David Forney, et al., Multidimensional constellations. I. Introduction, figures of merit, and generalized cross constellations, 1989, IEEE J. Sel. Areas Commun.

[20]  A. D. Wyner, Bounds on communication with polyphase coding, 1966.

[21]  Aaron D. Wyner, et al., On the capacity of the Gaussian channel with a finite number of input levels, 1990, IEEE Trans. Inf. Theory.

[22]  Günter Söder, et al., Die Kanalkapazität als Grenze für die Digitalsignalübertragung [Channel capacity as the limit for digital signal transmission], 1985.

[23]  W. Oettli, Capacity-achieving input distributions for some amplitude-limited channels with additive noise.

[24]  Nelson M. Blachman, The convolution inequality for entropy powers, 1965, IEEE Trans. Inf. Theory.

[25]  Werner Oettli, Capacity-achieving input distributions for some amplitude-limited channels with additive noise (Corresp.), 1974, IEEE Trans. Inf. Theory.

[26]  Steven A. Tretter, et al., On optimal shaping of multidimensional constellations, 1994, IEEE Trans. Inf. Theory.

[27]  R. G. Gallager, Information Theory and Reliable Communication, 1968.

[28]  Amir K. Khandani, et al., Shaping multidimensional signal spaces - I: Optimum shaping, shell mapping, 1993, IEEE Trans. Inf. Theory.

[29]  D. Luenberger, Optimization by Vector Space Methods, 1968.

[30]  Gottfried Ungerboeck, Channel coding with multilevel/phase signals, 1982, IEEE Trans. Inf. Theory.

[31]  R. Clark Jones, Information Capacity of Photographic Films, 1961.

[32]  A. R. Calderbank, The Mathematics of Modems, 1991.

[33]  G. D. Forney, et al., Combined equalization and coding using precoding, 1991, IEEE Communications Magazine.

[34]  Dariush Divsalar, et al., Introduction to Trellis-Coded Modulation With Applications, 1991.