Distributed Rate-Distortion With Common Components

We describe a scheme for rate-distortion with distributed encoding in which the sources to be compressed contain a common component. We show that this scheme is optimal in some situations and that it strictly improves upon existing schemes, which do not make full use of common components. This establishes that independent quantization followed by independent binning is not optimal for the two-encoder problem with a distortion constraint on one source. We also show that independent quantization and binning is suboptimal for the three-encoder problem in which the goal is to reproduce one of the sources losslessly. This provides a counterexample that is fundamentally different from one provided earlier by Körner and Marton. The proofs rely on the binary analogue of the entropy power inequality and the existence of a rate loss for the binary symmetric Wyner-Ziv problem.

[1] Toby Berger, et al. An upper bound on the sum-rate distortion function and its corresponding rate allocation schemes for the CEO problem, 2004, IEEE Journal on Selected Areas in Communications.

[2] Te Sun Han, et al. A unified achievable rate region for a general class of multiterminal source coding systems, 1980, IEEE Trans. Inf. Theory.

[3] Toby Berger, et al. Sequential coding of correlated sources, 2000, IEEE Trans. Inf. Theory.

[4] Aaron B. Wagner, et al. Improved Slepian-Wolf exponents via Witsenhausen's rate, 2009, IEEE International Symposium on Information Theory.

[5] C. E. Shannon. A Mathematical Theory of Communication, 1948, Bell System Technical Journal.

[6] D. Slepian, et al. Noiseless Coding of Correlated Information Sources, 1973, IEEE Trans. Inf. Theory.

[7] Michael Gastpar, et al. The Wyner-Ziv problem with multiple sources, 2004, IEEE Transactions on Information Theory.

[8] P. Viswanath, et al. The Gaussian Many-Help-One Distributed Source Coding Problem, 2006 IEEE Information Theory Workshop - ITW '06 Chengdu.

[9] Toby Berger, et al. Rate-distortion for correlated sources with partially separated encoders, 1982, IEEE Trans. Inf. Theory.

[10] Aaron D. Wyner, et al. A theorem on the entropy of certain binary sequences and applications-II, 1973, IEEE Trans. Inf. Theory.

[11] Richard E. Blahut, et al. Partial side information problem: Equivalence of two inner bounds, 2008, 42nd Annual Conference on Information Sciences and Systems.

[12] Toby Berger, et al. Multiterminal source encoding with one distortion criterion, 1989, IEEE Trans. Inf. Theory.

[13] Thomas M. Cover, et al. Elements of Information Theory, 2005.

[14] C. E. Shannon. Coding Theorems for a Discrete Source With a Fidelity Criterion, Institute of Radio Engineers, International Convention Record, vol. 7, 1959.

[15] Pramod Viswanath, et al. Rate Region of the Quadratic Gaussian Two-Encoder Source-Coding Problem, 2006, ISIT.

[16] Te Sun Han, et al. A dichotomy of functions F(X, Y) of correlated sources (X, Y), 1987, IEEE Trans. Inf. Theory.

[17] Aaron D. Wyner, et al. On source coding with side information at the decoder, 1975, IEEE Trans. Inf. Theory.

[18] Haim H. Permuter, et al. Coordination Capacity, 2009, IEEE Transactions on Information Theory.

[19] Yasutada Oohama. Gaussian Multiterminal Source Coding with Several Side Informations at the Decoder, 2006, IEEE International Symposium on Information Theory.

[20] Haim H. Permuter, et al. Two-Way Source Coding With a Helper, 2008, IEEE Transactions on Information Theory.

[21] Y. Oohama. Gaussian multiterminal source coding, 1995, Proceedings of 1995 IEEE International Symposium on Information Theory.

[22] M. Loève, et al. Elementary Probability Theory, 1977.

[24] Masoud Salehi, et al. Multiple access channels with arbitrarily correlated sources, 1980, IEEE Trans. Inf. Theory.

[25] R. Tyrrell Rockafellar. Convex Analysis, 1970, Princeton Landmarks in Mathematics and Physics.

[26] Venkat Anantharam, et al. An improved outer bound for multiterminal source coding, 2008, IEEE Transactions on Information Theory.

[27] Toby Berger, et al. The CEO problem [multiterminal source coding], 1996, IEEE Trans. Inf. Theory.

[28] Suhas N. Diggavi, et al. Wireless Network Information Flow: A Deterministic Approach, 2009, IEEE Transactions on Information Theory.

[29] Aaron D. Wyner, et al. The rate-distortion function for source coding with side information at the decoder, 1976, IEEE Trans. Inf. Theory.

[30] David Tse, et al. Multiaccess Fading Channels-Part I: Polymatroid Structure, Optimal Resource Allocation and Throughput Capacities, 1998, IEEE Trans. Inf. Theory.

[31] Aaron D. Wyner, et al. A theorem on the entropy of certain binary sequences and applications-I, 1973, IEEE Trans. Inf. Theory.

[32] János Körner, et al. How to encode the modulo-two sum of binary sources (Corresp.), 1979, IEEE Trans. Inf. Theory.

[33] Sui Tung. Multiterminal source coding (Ph.D. thesis abstr.), 1978, IEEE Trans. Inf. Theory.

[34] Imre Csiszár, et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.

[35] Toby Berger, et al. An upper bound on the rate distortion function for source coding with partial side information at the decoder, 1979, IEEE Trans. Inf. Theory.

[36] Rudolf Ahlswede, et al. Source coding with side information and a converse for degraded broadcast channels, 1975, IEEE Trans. Inf. Theory.

[37] Jun Chen, et al. A semicontinuity theorem and its application to network source coding, 2008, IEEE International Symposium on Information Theory.

[38] Tsachy Weissman, et al. Multiterminal Source Coding With Action-Dependent Side Information, 2011, IEEE Transactions on Information Theory.

[39] Etienne Perron, et al. Cooperative Source Coding with Encoder Breakdown, 2007, IEEE International Symposium on Information Theory.

[40] H. Witsenhausen. On Sequences of Pairs of Dependent Random Variables, 1975.

[41] Aaron B. Wagner, et al. Improved Source Coding Exponents via Witsenhausen's Rate, 2011, IEEE Transactions on Information Theory.

[42] Toby Berger, et al. Robust Distributed Source Coding, 2006, IEEE Transactions on Information Theory.