On the capacity of finite state channels and the analysis of convolutional accumulate-m codes

What are the fundamental limits of communication channels and channel coding systems? In general, these limits manifest themselves as thresholds separating what is possible from what is not. For example, the capacity of a communication channel is a coding-rate threshold above which reliable communication is not possible; at any coding rate below capacity, however, reliable communication is possible. Likewise, every fixed-rate coding scheme has a channel-noise threshold above which the probability of decoding error cannot be made arbitrarily small, while below the threshold many of the same coding systems can operate with very small error probability. In this dissertation, we consider the noise thresholds of Convolutional Accumulate-m (CA-m) codes, the capacity of finite-state channels (FSCs), and the information rates achievable via joint iterative decoding of irregular low-density parity-check (LDPC) codes over channels with memory.

CA-m codes are a class of turbo-like codes formed by serially concatenating a terminated convolutional code with a cascade of m interleaved rate-1 “accumulate” codes. The first two chapters consider these codes from two different perspectives. First, the sequence of m encoders is analyzed as a Markov chain to show that these codes converge to random codes, which are nearly optimal, as m goes to infinity. Next, a detailed threshold analysis is performed for both maximum-likelihood and iterative decoding of long CA-m codes with finite m.

An FSC is a discrete-time channel whose output depends on both the channel input and the channel state. Until recently, there was no practical method of estimating the capacity of an FSC. We introduce a simple Monte Carlo method that estimates the achievable information rate of any FSC driven by finite-memory Markov inputs; this method enables one to compute a non-decreasing sequence of lower bounds on the capacity.
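To illustrate the flavor of such a Monte Carlo estimator (this is a sketch of the general idea, not the dissertation's own implementation), the snippet below simulates a long output sequence from a Gilbert-Elliott binary symmetric channel with i.i.d. uniform inputs and runs the forward recursion of the underlying hidden Markov model to estimate the information rate. With uniform i.i.d. inputs the output is also i.i.d. uniform, so H(Y) = 1 bit/symbol and the rate estimate reduces to 1 minus the estimated entropy rate of the channel's flip process. All parameter values are illustrative.

```python
import math
import random

random.seed(1)

# Gilbert-Elliott channel: two hidden states (0 = good, 1 = bad),
# each with its own crossover probability.  Illustrative parameters:
stay_good, stay_bad = 0.95, 0.90
p_err = [0.01, 0.20]            # crossover probability per state

n = 200_000
s = 0                           # true (simulated) channel state
alpha = [0.5, 0.5]              # forward posterior over the hidden state
log_pz = 0.0
for _ in range(n):
    # evolve the hidden channel state
    if s == 0:
        s = 0 if random.random() < stay_good else 1
    else:
        s = 1 if random.random() < stay_bad else 0
    # with i.i.d. uniform inputs only the flip process z = x XOR y matters
    z = 1 if random.random() < p_err[s] else 0
    # forward recursion: predict, weight by likelihood, normalize
    pred0 = alpha[0] * stay_good + alpha[1] * (1 - stay_bad)
    pred1 = alpha[0] * (1 - stay_good) + alpha[1] * stay_bad
    like0 = p_err[0] if z else 1 - p_err[0]
    like1 = p_err[1] if z else 1 - p_err[1]
    j0, j1 = pred0 * like0, pred1 * like1
    norm = j0 + j1              # = P(z_k | z^{k-1})
    log_pz += math.log2(norm)
    alpha = [j0 / norm, j1 / norm]

h_z = -log_pz / n               # entropy-rate estimate of the flip process
rate_estimate = 1.0 - h_z       # bits per channel use, since H(Y) = 1 here
```

As n grows, the sample average -log2 P(z^n)/n converges to the true entropy rate of the hidden-Markov flip process, so the rate estimate sharpens; optimizing over Markov input distributions (rather than fixing i.i.d. inputs, as here) is what yields the non-decreasing sequence of capacity lower bounds described above.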
The joint iterative decoding of irregular LDPC codes over channels with memory is also considered. For a class of erasure channels with memory, we derive a closed-form recursion that can be used to verify necessary and sufficient conditions for successful decoding.
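The memoryless special case of such a recursion is the standard density-evolution fixed point for LDPC codes on the binary erasure channel, x_{l+1} = ε λ(1 − ρ(1 − x_l)); the erasure-channel-with-memory recursion generalizes this. The sketch below (the function name `decodes` and the tolerance are illustrative choices, not from the dissertation) iterates this well-known recursion for a (3,6)-regular ensemble to test whether a given channel erasure rate ε lies below the decoding threshold.

```python
def decodes(eps, lam=lambda z: z ** 2, rho=lambda z: z ** 5, iters=5000):
    """Density evolution for an LDPC ensemble on a memoryless BEC(eps).

    lam and rho are the edge-perspective degree polynomials; the defaults
    correspond to a (3,6)-regular ensemble.  Returns True when the erasure
    probability of the messages is driven to (numerically) zero, i.e. when
    iterative decoding succeeds in the long-blocklength limit.
    """
    x = eps  # erasure probability of variable-to-check messages
    for _ in range(iters):
        x = eps * lam(1.0 - rho(1.0 - x))
    return x < 1e-9

# The (3,6)-regular threshold is roughly eps* = 0.4294:
# decodes(0.40) -> True, decodes(0.45) -> False
```

Below the threshold the recursion has no fixed point other than zero, so the erasure probability collapses; above it, the iteration stalls at a nonzero stable fixed point.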
