CHAPTER 6 – Toward Constructive Slepian–Wolf Coding Schemes

This chapter deals with practical solutions to the Slepian–Wolf (SW) coding problem, i.e., the lossless compression of correlated sources by encoders that do not communicate. Here, we consider the case of two correlated binary sources X and Y, characterized by their joint distribution. If the two encoders communicate, it is well known from Shannon's theory that the minimum lossless rate for X and Y is given by the joint entropy H(X,Y). Slepian and Wolf established in 1973 [18] that this lossless compression rate bound can be approached with a vanishing error probability for infinitely long sequences, even if the two sources are coded separately, provided that they are decoded jointly and that their correlation is known to both the encoder and the decoder. Hence, the challenge is to construct a set of encoders that do not communicate and a joint decoder that can achieve this theoretical limit.

This chapter gives an overview of constructive solutions to both the asymmetric and the non-asymmetric SW coding problems. Asymmetric SW coding refers to the case where one source, say Y, is transmitted at its entropy rate and used as side information to decode the second source X. Non-asymmetric SW coding refers to the case where both sources are compressed at rates lower than their respective entropy rates. Sections 2 and 3 recall the principles and then describe practical schemes for asymmetric and symmetric coding, respectively. Practical solutions in which the compression rate is fixed a priori according to the correlation between the two sources are described first; in this case, the correlation between the two sources must be known or estimated at the transmitter. Rate-adaptive schemes, in which the SW code is incremental, are then presented.
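To make the rate bounds concrete, the following sketch computes them for a hypothetical correlation model in which Y differs from a uniform binary X through a binary symmetric channel with crossover probability p; the value p = 0.11 is chosen purely for illustration.

```python
# Sketch (illustrative assumption): X uniform binary, Y = X XOR E with
# E ~ Bernoulli(p), i.e., a binary-symmetric correlation channel.
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.11                  # illustrative crossover probability
H_X = 1.0                 # X is uniform, so H(X) = 1 bit
H_X_given_Y = h2(p)       # symmetric model: H(X|Y) = h2(p)
H_XY = H_X + H_X_given_Y  # joint entropy H(X,Y) = H(Y) + H(X|Y)

# Coding X and Y independently costs 2 bits per symbol pair, while the
# SW bound is H(X,Y) = 1 + h2(0.11), roughly 1.5 bits.  The asymmetric
# corner point sends Y at R_Y = H(Y) = 1 and X at R_X = H(X|Y) = h2(p).
```

For p = 0.11 the side-information rate h2(p) is close to 0.5 bit, so asymmetric SW coding roughly halves the rate needed for X compared with coding it independently.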
The chapter ends with Section 4, which covers various advanced SW coding topics, such as the design of schemes based on source codes and the generalization to non-binary sources and to M sources.
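To illustrate the asymmetric setup, here is a minimal sketch of a syndrome-based scheme in the spirit of DISCUS [17], built on the (7,4) Hamming code. The block length, the helper names, and the assumption that X and Y differ in at most one bit per block are illustrative choices, not a construction taken from this chapter.

```python
# Sketch: asymmetric Slepian-Wolf coding via syndromes.  X is compressed
# to its 3-bit syndrome per 7-bit block (rate 3/7 instead of 1); the
# decoder recovers X from the syndrome and the side information Y,
# assuming (illustratively) at most one differing bit per block.
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i is the binary
# representation of i+1, so the syndrome of a single-bit error pattern
# equals the (1-based) position of the flipped bit.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def encode(x):
    """Compress a 7-bit block x to its 3-bit syndrome."""
    return H @ x % 2

def decode(s, y):
    """Recover x from its syndrome s and the side information y."""
    # H(x + y) = s + Hy is the syndrome of the error e = x XOR y.
    diff = (s + H @ y) % 2
    pos = int("".join(map(str, diff)), 2)  # syndrome -> error position
    x_hat = y.copy()
    if pos:                    # nonzero syndrome: flip the offending bit
        x_hat[pos - 1] ^= 1
    return x_hat

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 7)
e = np.zeros(7, dtype=int)
e[rng.integers(7)] = 1         # correlation noise: one flipped bit
y = (x + e) % 2                # side information at the decoder
assert np.array_equal(decode(encode(x), y), x)
```

The same coset-binning idea underlies the practical schemes of this chapter: stronger channel codes (convolutional, turbo, or LDPC codes, as in the references below) replace the Hamming code so that denser correlation noise can be corrected.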

[1] Christine Guillemot, et al. Overlapped Quasi-Arithmetic Codes for Distributed Video Coding, 2007, IEEE International Conference on Image Processing.

[2] Ian H. Witten, et al. Arithmetic coding for data compression, 1987, CACM.

[3] Peiyu Tan, et al. A Practical and Optimal Symmetric Slepian-Wolf Compression Strategy Using Syndrome Formers and Inverse Syndrome Formers, 2005.

[4] Shlomo Shamai, et al. Nested linear/lattice codes for structured multiterminal binning, 2002, IEEE Trans. Inf. Theory.

[5] R. Urbanke, et al. Asynchronous Slepian-Wolf coding via source-splitting, 1997, Proceedings of IEEE International Symposium on Information Theory.

[6] Mina Sartipi, et al. Distributed source coding in wireless sensor networks using LDPC coding: the entire Slepian-Wolf rate region, 2005, IEEE Wireless Communications and Networking Conference.

[7] Jorma Rissanen, et al. Generalized Kraft Inequality and Arithmetic Coding, 1976, IBM J. Res. Dev.

[8] Javier Garcia-Frías, et al. Approaching the Slepian-Wolf boundary using practical channel codes, 2004, IEEE International Symposium on Information Theory (ISIT).

[9] Rick S. Blum, et al. An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem, 2005, EURASIP J. Adv. Signal Process.

[10] Kannan Ramchandran, et al. Distributed source coding: symmetric rates and applications to sensor networks, 2000, Proceedings of the Data Compression Conference (DCC).

[11] Hend Alqamzi, et al. An optimal distributed and adaptive source coding strategy using rate-compatible punctured convolutional codes, 2005, IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP).

[12] Jing Li, et al. Enhancing the robustness of distributed compression using ideas from channel coding, 2005, IEEE Global Telecommunications Conference (GLOBECOM).

[13] Pier Luigi Dragotti, et al. Symmetric and asymmetric Slepian-Wolf codes with systematic and nonsystematic linear codes, 2005, IEEE Communications Letters.

[14] Zixiang Xiong, et al. Compression of binary sources with side information at the decoder using LDPC codes, 2002, IEEE Communications Letters.

[15] Aaron D. Wyner, et al. Recent results in the Shannon theory, 1974, IEEE Trans. Inf. Theory.

[16] C. Guillemot, et al. Rate-adaptive turbo-syndrome scheme for Slepian-Wolf coding, 2007, Conference Record of the Forty-First Asilomar Conference on Signals, Systems and Computers.

[17] Kannan Ramchandran, et al. Distributed source coding using syndromes (DISCUS): design and construction, 2003, IEEE Trans. Inf. Theory.

[18] Jack K. Wolf, et al. Noiseless coding of correlated information sources, 1973, IEEE Trans. Inf. Theory.

[19] A. Kh. Al Jabri, et al. Zero-Error Codes for Correlated Information Sources, 1997, IMACC.

[20] Ying Zhao, et al. Compression of correlated binary sources using turbo codes, 2001, IEEE Communications Letters.

[21] Michelle Effros, et al. Optimal code design for lossless and near lossless source coding in multiple access networks, 2001, Proceedings of the Data Compression Conference (DCC).

[22] Dake He, et al. Rateless Slepian-Wolf Coding Based on Rate Adaptive Low-Density-Parity-Check Codes, 2007, IEEE International Symposium on Information Theory.

[23] Bernd Girod, et al. Rate-adaptive codes for distributed source coding, 2006, Signal Processing.

[24] Ying Zhao, et al. Data compression of correlated non-binary sources using punctured turbo codes, 2002, Proceedings of the Data Compression Conference (DCC).

[25] Zixiang Xiong, et al. Design of Slepian-Wolf codes by channel code partitioning, 2004, Proceedings of the Data Compression Conference (DCC).

[26] Bernd Girod, et al. Compression with side information using turbo codes, 2002, Proceedings of the Data Compression Conference (DCC).

[27] Zixiang Xiong, et al. On code design for the Slepian-Wolf problem and lossless multiterminal networks, 2006, IEEE Trans. Inf. Theory.

[28] Vladimir Sidorenko, et al. Decoding of convolutional codes using a syndrome trellis, 1994, IEEE Trans. Inf. Theory.

[29] Dmitry Malioutov, et al. Distributed source coding using serially-concatenated-accumulate codes, 2004, IEEE Information Theory Workshop.

[30] Richard Clark Pasco. Source coding algorithms for fast data compression, 1976.

[31] Kannan Ramchandran, et al. Distributed code constructions for the entire Slepian-Wolf rate region for arbitrarily correlated sources, 2004, Proceedings of the Data Compression Conference (DCC).

[32] Christine Guillemot, et al. Distributed coding using punctured quasi-arithmetic codes for memory and memoryless sources, 2009, Picture Coding Symposium.

[33] Aline Roumy, et al. Rate-adaptive codes for the entire Slepian-Wolf region and arbitrarily correlated sources, 2008, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).

[34] Enrico Magli, et al. Distributed Arithmetic Coding, 2007, IEEE Communications Letters.

[35] Mina Sartipi, et al. Distributed source coding using short to moderate length rate-compatible LDPC codes: the entire Slepian-Wolf rate region, 2008, IEEE Trans. Commun.

[36] John Cocke, et al. Optimal decoding of linear codes for minimizing symbol error rate (Corresp.), 1974, IEEE Trans. Inf. Theory.

[37] Patrick Mitran, et al. Coding for the Slepian-Wolf problem with turbo codes, 2001, IEEE Global Telecommunications Conference (GLOBECOM).

[38] Jack K. Wolf, et al. Efficient maximum likelihood decoding of linear block codes using a trellis, 1978, IEEE Trans. Inf. Theory.

[39] Thomas Guionnet, et al. Soft and Joint Source-Channel Decoding of Quasi-Arithmetic Codes, 2004, EURASIP J. Adv. Signal Process.