Revisiting the Slepian-Wolf coding problem for general sources: A direct approach

This paper clarifies the ε-achievable rate region of the Slepian-Wolf (SW) coding problem for general sources. We propose new upper and lower bounds on the error probability of the SW coding system for finite block lengths. The proposed bounds are mathematically simple and are characterized by an optimization over subsets of pairs of output sequences, a quantity closely related to the smooth max-entropy; they are tighter than the bounds obtained by Han. Using these bounds, we characterize the ε-achievable rate region. We also derive outer and inner bounds on the ε-achievable rate region in terms of the smooth max-entropy, and these two bounds coincide when the error probability vanishes.
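The bounds above are stated in terms of the smooth max-entropy. As a minimal illustration of that quantity alone (not of the paper's bounds), the following sketch computes the classical smooth max-entropy of a finite distribution, assuming the standard definition as the smooth Rényi entropy of order zero: the logarithm of the size of the smallest set of outcomes with total probability at least 1 − ε. The function name and the example distribution are illustrative choices, not taken from the paper.

```python
import math

def smooth_max_entropy(probs, eps):
    """Classical smooth max-entropy H_max^eps in bits.

    Assumed standard definition (smooth Renyi entropy of order 0):
    log2 of the size of the smallest set of outcomes whose total
    probability is at least 1 - eps.
    """
    needed = 1.0 - eps
    cumulative = 0.0
    # Greedily take the most probable outcomes until the target mass is reached.
    for count, p in enumerate(sorted(probs, reverse=True), start=1):
        cumulative += p
        if cumulative >= needed - 1e-12:  # small tolerance for float rounding
            return math.log2(count)
    return math.log2(len(probs))

# Example: a dyadic distribution on four outcomes.
P = [0.5, 0.25, 0.125, 0.125]
print(smooth_max_entropy(P, eps=0.0))  # all 4 outcomes needed -> 2.0
print(smooth_max_entropy(P, eps=0.2))  # 3 outcomes cover 0.875 -> log2(3)
```

Note how smoothing strictly helps: discarding a small amount of probability mass (ε = 0.2) shrinks the required support from 4 to 3 outcomes, which is the mechanism behind one-shot compression bounds of this kind.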
