Erasure/List Exponents for Slepian-Wolf Decoding

We analyze random coding error exponents associated with erasure/list Slepian-Wolf decoding using two different methods, and then compare the resulting bounds. The first method follows the well-known techniques of Gallager and Forney, while the second is based on a technique of distance enumeration, or more generally, type class enumeration, rooted in the statistical mechanics of a disordered system related to the random energy model (REM). The second method is guaranteed to yield exponent functions at least as tight as those of the first, and we demonstrate that for certain combinations of coding rates and thresholds, the bounds of the second method are strictly tighter than those of the first, by an arbitrarily large factor. In fact, the second method may even yield an infinite exponent in regions where the first method gives finite values. We also discuss the option of variable-rate Slepian-Wolf encoding and demonstrate how it can improve the resulting exponents.
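The erasure/list decision rule underlying this analysis is Forney's threshold test: the decoder outputs its best candidate only when that candidate's likelihood exceeds the combined likelihood of all competitors by a factor of e^{nT}, and otherwise declares an erasure (negative T yields list decoding instead). The following is a minimal toy sketch of this rule, assuming a hypothetical `forney_decide` helper operating on a precomputed list of candidate log-likelihoods; it is an illustration of the threshold mechanism, not the paper's exact Slepian-Wolf setting.

```python
import math

def forney_decide(log_likelihoods, n, T):
    """Toy sketch of Forney's erasure-decoding threshold rule.

    Given log-likelihoods for each candidate sequence, output the index of
    the best candidate if it beats the sum of all competitors' likelihoods
    by the factor e^{nT}; otherwise declare an erasure (return None).
    """
    best = max(range(len(log_likelihoods)), key=lambda i: log_likelihoods[i])
    # Log of the sum of the competitors' likelihoods (log-sum-exp for stability).
    others = [ll for i, ll in enumerate(log_likelihoods) if i != best]
    m = max(others)
    log_sum_others = m + math.log(sum(math.exp(ll - m) for ll in others))
    if log_likelihoods[best] - log_sum_others >= n * T:
        return best   # confident decision
    return None       # erasure

# One dominant candidate vs. two weak ones, block length n = 10:
print(forney_decide([-5.0, -30.0, -32.0], n=10, T=1.0))  # -> 0 (decide)
print(forney_decide([-5.0, -6.0, -7.0], n=10, T=1.0))    # -> None (erase)
```

Raising T trades undetected-error probability against erasure probability, which is exactly the trade-off the exponent functions in the paper quantify.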
