Information Combining

Consider coded transmission over a binary-input symmetric memoryless channel. The channel decoder uses the noisy observations of the code symbols to reproduce the transmitted code symbols: it combines the information about the individual code symbols into overall information about each code symbol, which may be the reproduced code symbol itself or its a posteriori probability. This tutorial addresses the problem of "information combining" from an information-theoretic point of view: the decoder combines the mutual information between the channel input symbols and the channel output symbols (observations) into the mutual information between one transmitted symbol and all channel output symbols. The actual value of the combined information depends on the statistical structure of the channels, but it can be upper- and lower-bounded over the assumed class of channels. The tutorial first introduces the concept of mutual information profiles and revisits the well-known Jensen's inequality. Using these tools, bounds on information combining are derived for single parity-check codes and for repetition codes. The application of the bounds is illustrated with four examples: information processing characteristics of coding schemes, including extrinsic information transfer (EXIT) functions; the design of multiple turbo codes; bounds on the decoding threshold of low-density parity-check codes; and the EXIT function of the accumulator.
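To make the bounds concrete, the following sketch evaluates the two extremal cases for length-3 codes with two observation channels of given mutual informations I1 and I2: for the single parity-check code, binary erasure channels (BECs) give the lower bound I1*I2 and binary symmetric channels (BSCs) give the upper bound, while for the repetition code the roles of BEC and BSC are swapped. This is an illustrative sketch only, not part of the original text; the function names and the restriction to length-3 codes are assumptions made for the example, and the closed-form expressions are the standard BEC and BSC combining formulas.

```python
import math


def h2(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def h2_inv(y, tol=1e-12):
    """Inverse of h2 on [0, 1/2], computed by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2


def spc_bounds(i1, i2):
    """Bounds on the combined (extrinsic) information about one symbol of a
    length-3 single parity-check code, given the mutual informations i1, i2
    of the two observation channels (assumed BISMCs).
    Lower bound: both channels BECs -> i1 * i2
    Upper bound: both channels BSCs -> 1 - h2(p1*(1-p2) + p2*(1-p1))
    """
    lower = i1 * i2
    p1, p2 = h2_inv(1 - i1), h2_inv(1 - i2)
    upper = 1 - h2(p1 * (1 - p2) + p2 * (1 - p1))
    return lower, upper


def rep_bounds(i1, i2):
    """Bounds for a length-3 repetition code (two observations of one bit);
    here the roles of BEC and BSC are swapped.
    Lower bound: BSCs -> 1 + h2(p1*(1-p2) + p2*(1-p1)) - h2(p1) - h2(p2)
    Upper bound: BECs -> 1 - (1-i1)*(1-i2)
    """
    p1, p2 = h2_inv(1 - i1), h2_inv(1 - i2)
    lower = 1 + h2(p1 * (1 - p2) + p2 * (1 - p1)) - h2(p1) - h2(p2)
    upper = 1 - (1 - i1) * (1 - i2)
    return lower, upper


if __name__ == "__main__":
    print(spc_bounds(0.5, 0.5))  # approx (0.25, 0.29)
    print(rep_bounds(0.5, 0.5))  # approx (0.71, 0.75)
```

For I1 = I2 = 0.5, the combined information of the parity check lies between about 0.25 (BEC case) and 0.29 (BSC case), whereas for the repetition code it lies between about 0.71 (BSC case) and 0.75 (BEC case), illustrating the duality between the two extremal channels.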
