The Optimal Density of Infinite Constellations for the Gaussian Channel

The setting of a Gaussian channel without power constraints is considered. In this setting the codewords are points in an n-dimensional Euclidean space (an infinite constellation). The channel-coding analog of the number of codewords is the density of the constellation points, and the analog of the communication rate is the normalized log density (NLD). The highest achievable NLD with vanishing error probability (which can be thought of as the capacity) is known, as are error exponents for the setting. In this work we are interested in the optimal NLD for communication when a fixed, nonzero error probability is allowed. In classical channel coding the gap to capacity is characterized by the channel dispersion (and cannot be derived from error exponent theory). In the unconstrained setting, we show that as the codeword length (dimension) n grows, the gap to the highest achievable NLD is inversely proportional (to first order) to the square root of the block length. We give an explicit expression for the proportionality constant: the inverse Q-function of the allowed error probability, times the square root of 1/2. In analogy to a similar result in classical channel coding, it follows that the dispersion of infinite constellations is 1/2 nat² per channel use. We show that this optimal convergence rate can be achieved using lattices, so the result holds for the maximal error probability as well. Connections to the error exponent of the power-constrained Gaussian channel and to the volume-to-noise ratio as a figure of merit are discussed.
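The first-order result stated above can be illustrated numerically: the gap between the optimal NLD at blocklength n and its asymptotic limit is approximately sqrt(1/(2n)) times the inverse Q-function of the target error probability. The sketch below (an illustrative computation, not code from the paper; the function name and the dispersion default of 1/2 follow the abstract's statement) evaluates this approximation using only the Python standard library.

```python
from math import sqrt
from statistics import NormalDist

def nld_gap(eps: float, n: int, dispersion: float = 0.5) -> float:
    """First-order gap (in nats) to the highest achievable NLD at
    blocklength n and error probability eps, per the abstract:
    gap ≈ sqrt(dispersion / n) * Q^{-1}(eps), with dispersion = 1/2 nat².
    """
    # Q^{-1}(eps) is the (1 - eps) quantile of the standard normal.
    q_inv = NormalDist().inv_cdf(1.0 - eps)
    return sqrt(dispersion / n) * q_inv

# Example: at eps = 1e-3, doubling sqrt(n) halves the gap to the optimal NLD.
for n in (100, 400, 1600):
    print(n, nld_gap(1e-3, n))
```

Note the characteristic O(1/sqrt(n)) decay: quadrupling the blocklength halves the gap, which is the dispersion-style behavior the abstract contrasts with error-exponent analysis.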
