The trellis complexity of convolutional codes

Convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. Linear block codes also have a natural, though not in general regular, "minimal" trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of an unenhanced Viterbi decoding algorithm can be accurately estimated by the number of trellis edge symbols per encoded bit. It would therefore appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, this comparison is muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal one. Ironically, then, we seem to know more about minimal trellis representations for block codes than for convolutional codes. We provide a remedy by developing a theory of minimal trellises for convolutional codes, which allows a direct performance-complexity comparison between block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-canonical generator matrix, from which the minimal trellis for the code can be constructed directly. A further by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
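To make the complexity measure mentioned above concrete, the following sketch counts trellis edge symbols per information bit for the *conventional* trellis of a rate-k/n binary convolutional code with total encoder memory m. It relies only on standard facts (2^m states, 2^k outgoing edges per state, n output symbols per edge); the per-bit normalization and the function name are our own assumptions for illustration, not notation from the paper.

```python
def conventional_trellis_complexity(k: int, n: int, m: int) -> float:
    """Edge symbols per information bit in the conventional trellis of a
    rate-k/n binary convolutional code with total memory m.

    Assumed counting convention (hypothetical, for illustration):
      states per section      = 2**m
      edges per state         = 2**k   (one per input k-tuple)
      symbols per edge label  = n      (one output n-tuple)
      bits encoded per section = k
    """
    states = 2 ** m
    edges_per_state = 2 ** k
    edge_symbols_per_section = states * edges_per_state * n
    return edge_symbols_per_section / k

# Example: the standard rate-1/2, memory-6 (constraint length 7) code
# has 64 states, 2 edges per state, 2 symbols per edge:
# 64 * 2 * 2 = 256 edge symbols per information bit.
print(conventional_trellis_complexity(1, 2, 6))
```

Under this convention, a high-rate code such as a rate-3/4 code with the same memory has conventional-trellis complexity 2^(6+3) * 4 / 3, roughly 683 edge symbols per bit, which illustrates why punctured constructions (and, in the paper's theory, minimal trellises) are attractive at high rates.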
