On zero-error coding of correlated sources

The problem of separate zero-error coding of correlated sources is considered. Inner and outer single-letter bounds on the achievable rate region are established, and conditions under which they coincide are investigated. It is shown that successive encoding combined with time sharing is not always an optimal coding strategy, and conditions for its optimality are derived. The inner bound on the achievable rate region follows as a special case of a single-letter characterization of a generalized zero-error multiterminal rate-distortion problem. Applications of this characterization to a problem of remote computing are also explored. Further results include (i) a product-space characterization of the achievable rates, (ii) bounds for finite block length, and (iii) asymptotic fixed-length rates.
