Performance evaluation of list sequence MAP decoding

List-sequence (LS) decoding has the potential to yield significant coding gain beyond that of conventional single-sequence decoding, and it can be implemented with full backward compatibility in systems where an error-detecting code is concatenated with an error-correcting code. LS maximum-likelihood (ML) decoding provides a list of estimated sequences in likelihood order. For convolutional codes, this list can be obtained with the serial list Viterbi algorithm (SLVA). By modifying the metric increments of the SLVA to take bitwise a priori probabilities into account, an LS maximum a posteriori (MAP) probability decoding algorithm is obtained that produces an ordered list of sequence MAP estimates. This paper studies the performance of the resulting LS-MAP decoding algorithm. Computer simulations and approximate analytical expressions, based on geometrical considerations of the decision domains of LS decoders, are presented. We focus on the frame-error performance of LS-MAP decoding, with genie-assisted error detection, on the additive white Gaussian noise channel. It is concluded that LS-MAP decoding exploits a priori information more efficiently than conventional single-sequence MAP decoding does, yielding larger performance improvements. Notably, at low signal-to-noise ratios, LS-MAP decoding can provide significant gains over LS-ML decoding; in this regime, feedback convolutional codes are furthermore observed to outperform their feedforward counterparts. Since LS-MAP decoding can be implemented in existing systems at a modest complexity increase, it should find wide application, for example in joint source-channel decoding and other kinds of iterative decoding.
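The core modification described above — augmenting the channel branch metric with a log a priori term for each information bit, so that the trellis search maximizes the a posteriori sequence probability rather than the likelihood — can be illustrated with a minimal sketch. This is a single-sequence Viterbi decoder, not the paper's SLVA (which additionally extracts a ranked list of survivors); the rate-1/2 (7,5)-octal feedforward code, the BPSK correlation metric, and all function names are assumptions made for the example.

```python
import math

# Illustrative sketch: sequence-MAP Viterbi decoding for a rate-1/2
# feedforward convolutional code, generators (7, 5) octal, memory 2.
# Branch metric = channel correlation + log P(a priori) of the input bit;
# with uniform priors this reduces to ordinary ML Viterbi decoding.

G = [0b111, 0b101]  # generator polynomials (7, 5) in octal
M = 2               # encoder memory

def encode(bits):
    """Feedforward convolutional encoding, zero initial state."""
    state, out = 0, []
    for b in bits:
        reg = (b << M) | state                      # (u_k, u_{k-1}, u_{k-2})
        out.extend([bin(reg & g).count("1") & 1 for g in G])
        state = reg >> 1
    return out

def viterbi_map(rx, priors, n_bits):
    """rx: soft BPSK samples (0 -> +1, 1 -> -1); priors[k] = P(u_k = 1)."""
    n_states = 1 << M
    metric = [0.0] + [-1e9] * (n_states - 1)        # start in the zero state
    paths = [[] for _ in range(n_states)]
    for k in range(n_bits):
        new_metric = [-1e9] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            for b in (0, 1):
                reg = (b << M) | s
                nxt = reg >> 1
                sym = [1 - 2 * (bin(reg & g).count("1") & 1) for g in G]
                # channel term (correlation with the received samples)
                m = metric[s] + sum(r * c for r, c in zip(rx[2*k:2*k+2], sym))
                # a priori term: this is the MAP modification
                m += math.log(priors[k] if b else 1.0 - priors[k])
                if m > new_metric[nxt]:
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = max(range(n_states), key=lambda s: metric[s])
    return paths[best]
```

With uniform priors (P(u_k = 1) = 0.5) the a priori term is the same constant on every branch and the decoder coincides with ML Viterbi decoding; skewed priors bias the search toward a priori probable information sequences, which is the mechanism that LS-MAP decoding exploits list-wide.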
