Capacity-achieving codes for finite-state channels with maximum-likelihood decoding

Codes on sparse graphs have been shown to achieve remarkable performance on point-to-point channels with low decoding complexity. Most results in this area, however, rest on experimental evidence and/or approximate analysis. Whether codes on sparse graphs can achieve the capacity of noisy channels under iterative decoding remains open, and has been conclusively and positively answered only for the binary erasure channel. On the other hand, codes on sparse graphs have been proven to achieve the capacity of memoryless, binary-input, output-symmetric channels with finite graphical complexity per information bit when maximum-likelihood (ML) decoding is performed. In this paper, we consider transmission over finite-state channels (FSCs). We derive upper bounds on the average error probability of code ensembles under ML decoding. Based on these bounds, we show that codes on sparse graphs can achieve the symmetric information rate (SIR) of FSCs, i.e., the maximum rate achievable with independently and uniformly distributed input sequences. To achieve rates beyond the SIR, we consider a simple quantization scheme that, when applied to ensembles of codes on sparse graphs, induces a Markov distribution on the transmitted sequence. By deriving average error probability bounds for these quantized code ensembles, we prove that they achieve the information rates corresponding to the induced Markov distribution and can thus approach the FSC capacity.
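For concreteness, the two rates discussed above can be stated explicitly. The following are the standard definitions for an FSC with input sequence $X_1^N$ and output sequence $Y_1^N$; the notation is introduced here as a reader's aid and is not taken verbatim from the paper:

$I_{\mathrm{SIR}} \;=\; \lim_{N\to\infty} \frac{1}{N}\, I(X_1^N; Y_1^N)$, with $X_1,\dots,X_N$ independent and uniformly distributed,

$I(\mu) \;=\; \lim_{N\to\infty} \frac{1}{N}\, I(X_1^N; Y_1^N)$, with $X_1^N$ drawn from a stationary Markov input distribution $\mu$.

Since i.i.d. uniform inputs are a special case of Markov inputs, $I_{\mathrm{SIR}} \le \sup_{\mu} I(\mu) \le C$, where $C$ is the FSC capacity; the quantized code ensembles described above target rates $I(\mu)$ for the Markov distribution $\mu$ induced by the quantizer.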
