An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem

A simple but powerful scheme exploiting the binning concept for asymmetric lossless distributed source coding is proposed. The novelty of the proposed scheme is the introduction of a syndrome former (SF) in the source encoder and an inverse syndrome former (ISF) in the source decoder, which allows an existing linear channel code to be exploited efficiently without modifying its code structure or decoding strategy. For most channel codes, constructing SF-ISF pairs is straightforward. For parallel and serial concatenated codes, and in particular parallel and serial turbo codes, where such a construction appears less obvious, an efficient way of building linear-complexity SF-ISF pairs is demonstrated. It is shown that the proposed SF-ISF approach is simple, provably optimal, and applicable to any linear channel code. Simulations using conventional and asymmetric turbo codes demonstrate a compression rate that is only 0.06 bit/symbol away from the theoretical limit, which is among the best results reported so far.
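To make the SF-ISF binning idea concrete, the following is a minimal sketch using a toy (7,4) Hamming code with brute-force nearest-codeword decoding, rather than the turbo codes treated in the paper. The source block x is compressed to its syndrome (the SF), and the decoder uses an ISF (any map from a syndrome to a coset representative) together with the side information y to recover x. All names (sf_encode, isf, sw_decode) and the particular parity-check matrix are illustrative assumptions, not constructions from the paper.

```python
# Toy illustration of syndrome-former (SF) / inverse-syndrome-former (ISF) binning
# for asymmetric Slepian-Wolf coding, using a (7,4) Hamming code.
import itertools
import numpy as np

# Parity-check matrix: column j is the binary expansion of j (classic Hamming code).
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=int)

# Enumerate the 16 codewords (H c = 0 mod 2); brute force is fine for a toy code.
CODEWORDS = [np.array(c) for c in itertools.product([0, 1], repeat=7)
             if not ((H @ np.array(c)) % 2).any()]

def sf_encode(x):
    """Syndrome former: compress the 7-bit source block x to its 3-bit syndrome."""
    return (H @ x) % 2

def isf(s):
    """Inverse syndrome former: return any t with H t = s.
    For this H the columns enumerate 1..7, so a single-1 vector suffices."""
    j = 4 * s[0] + 2 * s[1] + s[2]          # column index encoded by s
    t = np.zeros(7, dtype=int)
    if j:
        t[j - 1] = 1
    return t

def sw_decode(s, y):
    """Recover x from its syndrome s and correlated side information y
    (y differs from x in at most one bit, i.e. within the code's correcting power)."""
    t = isf(s)                               # coset representative of x's bin
    z = (y + t) % 2                          # shift the bin onto the channel code
    c = min(CODEWORDS, key=lambda cw: int(((z + cw) % 2).sum()))  # nearest codeword
    return (c + t) % 2                       # shift back to recover x

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[2] ^= 1                      # side information: one bit flipped
assert (sw_decode(sf_encode(x), y) == x).all()
print("recovered the 7-bit block from 3 transmitted bits plus side information")
```

The sketch compresses 7 source bits to 3 syndrome bits; the paper's contribution is doing the same with powerful turbo codes, where the SF-ISF pair must be constructed with linear complexity and the existing iterative decoder is reused unchanged.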
