Error exponents for channel coding with application to signal constellation design

This paper concerns error exponents and the structure of input distributions maximizing the random coding exponent for a stochastic channel model. The following conclusions are obtained under general assumptions on the channel statistics. 1) The optimal input distribution has a finite number of mass points or, in the case of a complex channel, an amplitude distribution with finite support. 2) A new class of algorithms, based on the cutting-plane method, is introduced to construct an optimal input distribution. The algorithm produces a sequence of discrete distributions, together with upper and lower bounds on the random coding exponent at each iteration. 3) In the numerical examples considered, the resulting codes significantly outperform traditional signal constellation schemes such as quadrature amplitude modulation and phase-shift keying at all rates below capacity.
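For context, the random coding exponent maximized here is, in Gallager's standard form for a memoryless channel with law W(y|x) and input distribution Q,

```latex
E_r(R) \;=\; \max_{0 \le \rho \le 1}\,\max_{Q}\,\bigl[\,E_0(\rho, Q) - \rho R\,\bigr],
\qquad
E_0(\rho, Q) \;=\; -\log \sum_{y} \Bigl( \sum_{x} Q(x)\, W(y \mid x)^{1/(1+\rho)} \Bigr)^{\!1+\rho},
```

so that for each fixed ρ the inner problem is the maximization of the concave functional E_0(ρ, ·) over input distributions.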
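The cutting-plane construction lends itself to a compact illustration. The following is a minimal sketch, assuming a discrete memoryless channel with a fixed finite candidate input alphabet, of Kelley-style cutting planes applied to maximizing E_0(ρ, ·) over the input simplex; the function names and stopping rule are hypothetical, and the paper's actual algorithm addresses continuous input spaces and input constraints, so this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linprog

def e0(Q, W, rho):
    """Gallager's E0(rho, Q) for a discrete memoryless channel.
    W[y, x] = P(y | x); Q is an input distribution over the columns of W."""
    A = W ** (1.0 / (1.0 + rho))           # W(y|x)^{1/(1+rho)}
    s = A @ Q                              # s(y) = sum_x Q(x) A(y, x)
    Z = np.sum(s ** (1.0 + rho))
    val = -np.log(Z)
    # Gradient of E0 in Q; E0(rho, .) is concave on the simplex for fixed rho.
    grad = -(1.0 + rho) * (A.T @ (s ** rho)) / Z
    return val, grad

def cutting_plane_e0(W, rho, tol=1e-6, max_iter=200):
    """Maximize E0(rho, .) over the input simplex by a cutting-plane method,
    producing a sequence of discrete distributions with upper/lower bounds."""
    n = W.shape[1]
    Q = np.full(n, 1.0 / n)                # start from the uniform input
    cuts = []                              # (value, gradient, point) triples
    lb, best_Q = -np.inf, Q
    for _ in range(max_iter):
        f, g = e0(Q, W, rho)
        if f > lb:
            lb, best_Q = f, Q              # lower bound: best evaluated point
        cuts.append((f, g, Q))
        # LP in (Q, t): maximize t subject to the cuts
        # t <= f_k + g_k . (Q - Q_k) and Q in the probability simplex.
        c = np.zeros(n + 1); c[-1] = -1.0  # linprog minimizes, so minimize -t
        A_ub = np.array([np.append(-gk, 1.0) for (_, gk, _) in cuts])
        b_ub = np.array([fk - gk @ Qk for (fk, gk, Qk) in cuts])
        A_eq = np.append(np.ones(n), 0.0).reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0.0, 1.0)] * n + [(None, None)], method="highs")
        Q, ub = res.x[:n], res.x[-1]       # upper bound: LP optimal value
        if ub - lb < tol:
            break
    return best_Q, lb, ub

# Toy usage: a binary symmetric channel with crossover probability 0.1.
W_bsc = np.array([[0.9, 0.1], [0.1, 0.9]])
Q_opt, lb, ub = cutting_plane_e0(W_bsc, rho=1.0)
```

At every iteration the LP value is an upper bound on the maximal E_0 and the best evaluated point a lower bound, mirroring the bound sequence described in the abstract.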
