Distributed near-lossless coding of individual sequences X and Y is considered, where X and Y are first encoded separately and then sent to a joint decoder. Unlike in distributed near-lossless coding of correlated random sources, the joint decoder offers no advantage for individual sequences: the minimum numbers of bits that X and Y must each send to the joint decoder are the same as in two independent, parallel systems where X and Y are encoded separately and decoded separately. In this paper, however, we show that with interactive encoding and decoding, where the joint decoder is allowed to interact with both separate encoders, the minimum total number of bits exchanged between the joint decoder and the two separate encoders for each and every pair of individual sequences X and Y is the same as in a system where X and Y are jointly encoded and then jointly decoded, while X and Y can still be recovered by the joint decoder in a near-lossless manner.