The following coding problem for correlated discrete memoryless sources is considered. The two sources can be separately block encoded, and the values of the encoding functions are available to a decoder who wants to answer a certain question concerning the source outputs. Typically, this question has only a few possible answers (even as few as two). The problem is to determine the rates of the encoding functions that enable the decoder to answer this question correctly with high probability. It is proved that these rates are often as large as those needed for a full reproduction of the outputs of both sources. Furthermore, if one source is completely known at the decoder, this phenomenon already occurs when what is asked for is the joint type (joint composition) of the two source output blocks, or some function thereof such as the Hamming distance between the two blocks or, for alphabet size at least three, just the parity of this Hamming distance.
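To make the decoder's target quantities concrete, the following is a minimal Python sketch of the joint type of two output blocks, their Hamming distance, and its parity. The function names and the toy ternary example are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

def joint_type(x, y):
    """Empirical joint type (joint composition) of two equal-length blocks:
    the fraction of positions i at which (x[i], y[i]) equals each symbol pair."""
    assert len(x) == len(y)
    n = len(x)
    counts = Counter(zip(x, y))
    return {pair: c / n for pair, c in counts.items()}

def hamming_distance(x, y):
    """Number of positions at which the two blocks differ
    (itself a function of the joint type)."""
    return sum(a != b for a, b in zip(x, y))

# Hypothetical example: ternary alphabet {0, 1, 2}, block length 6.
x = [0, 1, 2, 2, 0, 1]
y = [0, 2, 2, 1, 0, 1]
print(joint_type(x, y))   # joint composition of the two blocks
d = hamming_distance(x, y)
print(d, d % 2)           # Hamming distance and its parity
```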