Let $\{(U_{i},V_{i})\}_{i=1}^{n}$ be a source of independent, identically distributed (i.i.d.) discrete random variables with joint probability mass function $p(u,v)$ and common part $W = f(U) = g(V)$ in the sense of Witsenhausen, Gács, and Körner. It is shown that such a source can be sent with arbitrarily small probability of error over a multiple access channel (MAC) $\{\mathcal{X}_{1} \times \mathcal{X}_{2}, \mathcal{Y}, p(y|x_{1},x_{2})\}$, with allowed codes $\{x_{1}(u), x_{2}(v)\}$, if there exist probability mass functions $p(s)$, $p(x_{1}|u,s)$, $p(x_{2}|v,s)$ such that
$$H(U|V) < I(X_{1}; Y | X_{2}, S, V),$$
$$H(V|U) < I(X_{2}; Y | X_{1}, S, U),$$
$$H(U,V|W) < I(X_{1}, X_{2}; Y | S, W),$$
$$H(U,V) < I(X_{1}, X_{2}; Y),$$
where $p(s,u,v,x_{1},x_{2},y) = p(s)\,p(u,v)\,p(x_{1}|u,s)\,p(x_{2}|v,s)\,p(y|x_{1},x_{2})$. This region includes the multiple access channel capacity region and the Slepian–Wolf data compression region as special cases.
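The four conditions above can be evaluated numerically for any concrete choice of $p(u,v)$, $p(s)$, the encoders, and the channel. The sketch below is not from the paper: the doubly symmetric source, the trivial $S$, the randomized encoders, and the noiseless channel are invented for illustration. Since the chosen $p(u,v)$ is strictly positive, the Gács–Körner common part $W$ is constant, so $H(U,V|W) = H(U,V)$.

```python
import numpy as np

AX = {"S": 0, "U": 1, "V": 2, "X1": 3, "X2": 4, "Y": 5}  # axis of each variable

def H(p):
    """Entropy in bits of a pmf stored as an ndarray of any shape."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def cond_H(joint, names, given=()):
    """H(names | given), computed by marginalizing the full joint pmf."""
    def marg(vs):
        keep = {AX[n] for n in vs}
        return joint.sum(axis=tuple(i for i in range(joint.ndim) if i not in keep))
    h = H(marg(names + given))
    return h - H(marg(given)) if given else h

def MI(joint, A, B, given=()):
    """I(A; B | given) = H(A | given) - H(A | B, given)."""
    return cond_H(joint, A, given) - cond_H(joint, A, B + given)

# Source: p(u,v) on {0,1}^2. Every entry is positive, so the common part W
# is constant and H(U,V|W) = H(U,V). Illustrative numbers, not from the paper.
p_uv = np.array([[0.45, 0.05], [0.05, 0.45]])
p_s = np.array([1.0])                      # trivial auxiliary S

# Encoders p(x1|u,s), p(x2|v,s): send the source symbol or a third symbol,
# each with probability 1/2 (the extra symbol pads the channel input entropy).
p_x1 = np.zeros((3, 2, 1))                 # indexed [x1, u, s]
for u in range(2):
    p_x1[u, u, 0] = 0.5
    p_x1[2, u, 0] = 0.5
p_x2 = p_x1.copy()                         # indexed [x2, v, s]

# Noiseless MAC: y = 3*x1 + x2 reveals both inputs.
p_ch = np.zeros((9, 3, 3))                 # indexed [y, x1, x2]
for a in range(3):
    for b in range(3):
        p_ch[3 * a + b, a, b] = 1.0

# Full joint p(s,u,v,x1,x2,y) = p(s) p(u,v) p(x1|u,s) p(x2|v,s) p(y|x1,x2).
joint = np.einsum("s,uv,aus,bvs,yab->suvaby", p_s, p_uv, p_x1, p_x2, p_ch)

checks = [
    (cond_H(joint, ("U",), ("V",)), MI(joint, ("X1",), ("Y",), ("X2", "S", "V"))),
    (cond_H(joint, ("V",), ("U",)), MI(joint, ("X2",), ("Y",), ("X1", "S", "U"))),
    (cond_H(joint, ("U", "V")),     MI(joint, ("X1", "X2"), ("Y",), ("S",))),  # W constant
    (cond_H(joint, ("U", "V")),     MI(joint, ("X1", "X2"), ("Y",))),
]
for lhs, rhs in checks:
    print(f"{lhs:.4f} < {rhs:.4f}: {lhs < rhs}")
```

For this particular choice all four inequalities hold strictly, so the theorem guarantees reliable transmission; with deterministic encoders ($x_1 = u$, $x_2 = v$) the last condition would hold only with equality, which is why the encoders here inject extra randomness.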
[1] D. Slepian and J. K. Wolf, "A coding theorem for multiple access channels with correlated sources," 1973.
[2] T. M. Cover, "An achievable rate region for the broadcast channel," IEEE Trans. Inf. Theory, 1975.
[3] R. Ahlswede, "Multi-way communication channels," 1973.
[4] H. S. Witsenhausen, "On sequences of pairs of dependent random variables," 1975.
[5] D. Slepian and J. K. Wolf, "Noiseless coding of correlated information sources," IEEE Trans. Inf. Theory, 1973.
[6] T. M. Cover, "A proof of the data compression theorem of Slepian and Wolf for ergodic sources," IEEE Trans. Inf. Theory, 1975.