Bounds on mutual information for simple codes using information combining

For coded transmission over a memoryless channel, two kinds of mutual information are considered: the mutual information between a code symbol and its noisy observation, and the overall mutual information between encoder input and decoder output. The overall mutual information is interpreted as a combination of the mutual informations associated with the individual code symbols; exploiting the code constraints in the decoding procedure thus amounts to combining mutual informations. For single parity-check codes and repetition codes, we present bounds on the overall mutual information that are based only on the mutual informations associated with the individual code symbols. Using these mutual-information bounds, we compute bounds on extrinsic information transfer (EXIT) functions and on information processing characteristics (IPC) for these codes.
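To make the notion of information combining concrete, the LaTeX sketch below records the standard extremal bounds for combining two independent observations, of the kind the abstract refers to: the single parity-check bounds follow from Mrs. Gerber's Lemma (Wyner and Ziv) together with the erasure-channel formula, and the repetition-code bounds follow by a chain-rule duality. This is a minimal sketch under assumed notation, not a statement of this paper's specific results: I_1, I_2 denote the per-symbol mutual informations, h the binary entropy function, p_i = h^{-1}(1 - I_i), and a * b = a(1-b) + b(1-a).

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Assumed notation (not taken from the abstract): I_1, I_2 are the mutual
% informations of the two per-symbol channels, h(.) is the binary entropy
% function, p_i = h^{-1}(1 - I_i), and a \ast b = a(1-b) + b(1-a).

% Single parity check X_1 \oplus X_2: the BEC attains the lower bound
% (product rule), and Mrs. Gerber's Lemma (Wyner--Ziv) shows the BSC
% attains the upper bound.
\begin{equation*}
  I_1 I_2 \;\le\; I(X_1 \oplus X_2;\, Y_1, Y_2)
  \;\le\; 1 - h\bigl(p_1 \ast p_2\bigr).
\end{equation*}

% Length-2 repetition code: with independent channels and uniform inputs,
% I_1 + I_2 = I(X_1 \oplus X_2; Y_1, Y_2) + I(X; Y_1, Y_2), so the
% parity-check extremes flip and the BEC now attains the upper bound.
\begin{equation*}
  I_1 + I_2 - 1 + h\bigl(p_1 \ast p_2\bigr)
  \;\le\; I(X;\, Y_1, Y_2)
  \;\le\; 1 - (1 - I_1)(1 - I_2).
\end{equation*}
\end{document}

For erasure-channel observations the two outer expressions I_1 I_2 and 1 - (1 - I_1)(1 - I_2) are exact, which is why they also appear as the check-node and variable-node update rules in density evolution over the BEC.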
