The information-outage probability of finite-length codes over AWGN channels

The performance of random error control codes approaches the Shannon capacity limit as the code length goes to infinity. When the code length is finite, the code cannot achieve arbitrarily low error probability, and a nonzero codeword error rate is inevitable. Information-theoretic bounds on codeword error rate as a function of length may be found through traditional methods such as sphere packing. Alternatively, the behavior of finite-length codes can be characterized in terms of an information-outage probability: the probability that the mutual information, which is a random variable, falls below the rate. In this paper, a Gaussian approximation is proposed that accurately models the information-outage probability for moderately small codes. The information-outage probability is related to several previously derived bounds, including Shannon's sphere-packing and random coding bounds, as well as a bound on maximal error probability known as Feinstein's lemma. It is shown that the information-outage probability is a useful predictor of achievable error rate.
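The Gaussian approximation described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's own derivation: it assumes Gaussian channel inputs and uses the standard AWGN capacity and channel-dispersion expressions, treating the per-block mutual information as approximately Normal with mean equal to capacity and variance shrinking as 1/n.

```python
import math

def awgn_outage_gaussian(snr: float, n: int, rate: float) -> float:
    """Gaussian approximation to the information-outage probability
    P(I_n < R) on an AWGN channel.

    snr  : linear signal-to-noise ratio (assumed Gaussian inputs)
    n    : block length in channel uses
    rate : code rate R in bits per channel use

    The capacity C and dispersion V below are the standard
    Gaussian-input AWGN expressions, assumed here for illustration.
    """
    log2e = math.log2(math.e)
    C = 0.5 * math.log2(1.0 + snr)                               # bits/use
    V = (snr * (snr + 2.0)) / (2.0 * (snr + 1.0) ** 2) * log2e ** 2
    # Per-block mutual information ~ Normal(C, V/n); outage when below R.
    z = (C - rate) * math.sqrt(n / V)
    return 0.5 * math.erfc(z / math.sqrt(2.0))                   # Q(z)
```

For a rate below capacity, the approximation predicts an outage probability that decays as the block length grows, which is the qualitative finite-length behavior the abstract describes.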
