Universal Quantization for Separate Encodings and Joint Decoding of Correlated Sources

We consider the multi-user lossy source-coding problem for continuous-alphabet sources. In earlier work, Ziv proposed a single-user universal coding scheme that uses uniform quantization with dither, followed by a lossless source encoder (entropy coder). In this paper, we generalize Ziv's scheme to the multi-user setting. For this generalized universal scheme, upper bounds are derived on the redundancies, defined as the differences between the actual rates and the closest corresponding rates on the boundary of the rate region. For the mean-square error distortion measure, it is shown that this scheme achieves redundancies of no more than 0.754 bits per sample for each user. These bounds are obtained without knowledge of the multi-user rate region, whose characterization remains an open problem in general. As a direct consequence of these results, inner and outer bounds on the achievable rate-distortion region are obtained.
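As a point of reference, the 0.754-bit figure matches the classical constant from Ziv's single-user universal quantization result; a commonly cited decomposition of that constant is sketched below. This is only a numerical check of where the number comes from, not the paper's multi-user redundancy argument.

```latex
% Sketch of the classical Ziv constant (not the paper's multi-user derivation):
% half a bit plus the penalty of uniform quantization noise relative to a
% Gaussian of the same variance.
\[
  \frac{1}{2} + \frac{1}{2}\log_2\!\frac{2\pi e}{12}
  \;=\; \frac{1}{2}\log_2\!\frac{\pi e}{3}
  \;\approx\; 0.754 \ \text{bits per sample.}
\]
```

A minimal numerical sketch of the single-user building block, subtractive-dither uniform quantization, is given below, assuming a mean-square error target. The helper name dithered_quantize, the step-size choice, and the Gaussian test source are illustrative assumptions; the lossless (entropy / Slepian-Wolf) coding stage that would follow the quantizer, and the joint decoder of the multi-user scheme, are omitted.

```python
import numpy as np

def dithered_quantize(x, step, rng):
    """Subtractive-dither uniform quantization (Ziv-style building block).
    The encoder adds a dither Z ~ Uniform(-step/2, step/2) shared with the
    decoder, quantizes to the nearest lattice point, and the decoder
    subtracts the same dither from the reconstruction."""
    z = rng.uniform(-step / 2, step / 2, size=x.shape)  # shared dither
    q = step * np.round((x + z) / step)                 # values fed to the entropy coder
    x_hat = q - z                                       # decoder reconstruction
    return q, x_hat

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)        # illustrative continuous-alphabet source
D_target = 0.05
step = np.sqrt(12 * D_target)       # uniform quantization noise variance = step^2 / 12
q, x_hat = dithered_quantize(x, step, rng)
print("MSE:", np.mean((x - x_hat) ** 2))  # close to D_target, independent of the source
```

The point of the subtractive dither is that the end-to-end error equals the quantization error of x + z, which is uniform on [-step/2, step/2] regardless of the source statistics; this source-independence is what makes the scheme universal, so the printed MSE should be near the target distortion for any input.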
