Joint source-channel decoding of correlated sources over noisy channels

We consider the case of two correlated binary information sequences. Instead of compressing the information by source coding, both sequences are independently channel encoded and transmitted over an AWGN channel. The correlation between the two sequences is exploited at the receiver, allowing reliable communication at signal-to-noise ratios very close to the theoretical limits established by combining the Shannon and Slepian-Wolf theorems.
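The theoretical limit mentioned above can be sketched numerically. The following is a minimal illustration, not the paper's own derivation: it assumes the two binary sources are related by Y = X xor E with E ~ Bernoulli(p), so the Slepian-Wolf bound requires conveying H(X,Y) = 1 + h(p) bits per source pair, and it uses the real-valued Gaussian-input AWGN capacity C = 0.5*log2(1 + SNR) (practical BPSK signaling would give a somewhat different limit). The function names are illustrative, not from the paper.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy h(p) in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def slepian_wolf_joint_entropy(p: float) -> float:
    """H(X,Y) for X ~ Bernoulli(1/2), Y = X xor E, E ~ Bernoulli(p):
    H(X,Y) = H(X) + H(Y|X) = 1 + h(p)."""
    return 1.0 + binary_entropy(p)

def min_snr_awgn(rate_bits_per_use: float) -> float:
    """Smallest linear SNR at which a real AWGN channel with
    C = 0.5*log2(1 + SNR) supports the given rate (assumed model)."""
    return 2.0 ** (2.0 * rate_bits_per_use) - 1.0

# Two independent rate-1/2 encoders -> 2 channel uses per source pair,
# which must jointly convey H(X,Y) bits, i.e. H(X,Y)/2 bits per use.
p = 0.1  # illustrative correlation parameter
h_joint = slepian_wolf_joint_entropy(p)
snr_min = min_snr_awgn(h_joint / 2.0)
print(f"H(X,Y) = {h_joint:.3f} bits/pair, "
      f"SNR limit = {10 * math.log10(snr_min):.2f} dB")
```

As the correlation strengthens (p decreases), H(X,Y) shrinks toward 1 bit and the achievable SNR limit drops well below that of two independent equiprobable sources, which is the gain the joint decoder exploits.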
