Rate Region of the Quadratic Gaussian Two-Encoder Source-Coding Problem

We determine the rate region of the quadratic Gaussian two-encoder source-coding problem. This rate region is achieved by a simple architecture that separates the analog and digital aspects of the compression. Furthermore, this architecture requires higher rates to send a Gaussian source than it does to send any other source with the same covariance. Our techniques can also be used to determine the sum-rate of some generalizations of this classical problem. Our approach involves coupling the problem to a quadratic Gaussian "CEO problem."
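
For concreteness, the rate region determined here can be sketched as follows, up to notation: the unit source variances, correlation coefficient \(\rho\), and distortion targets \(d_1, d_2\) are assumptions of this sketch and are not specified in the abstract itself; consult the paper for the precise theorem statement. The region consists of the rate pairs \((R_1, R_2)\) satisfying

\[
R_1 \ge \tfrac{1}{2}\log^{+}\!\left[\frac{1-\rho^2+\rho^2\,2^{-2R_2}}{d_1}\right],
\qquad
R_2 \ge \tfrac{1}{2}\log^{+}\!\left[\frac{1-\rho^2+\rho^2\,2^{-2R_1}}{d_2}\right],
\]
\[
R_1 + R_2 \ge \tfrac{1}{2}\log^{+}\!\left[\frac{(1-\rho^2)\,\beta(d_1,d_2)}{2\,d_1 d_2}\right],
\quad\text{where }\ \beta(d_1,d_2)=1+\sqrt{1+\frac{4\rho^2 d_1 d_2}{(1-\rho^2)^2}}
\ \text{ and }\ \log^{+}x=\max\{\log x,\,0\}.
\]

The first two constraints match the rate needed when the other encoder's rate is treated as side information, while the sum-rate constraint is the one obtained by coupling the problem to the quadratic Gaussian CEO problem.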
