Consider a generalized multiterminal source coding system in which $\binom{\ell}{m}$ encoders, each observing a distinct size-$m$ subset of $\ell$ ($\ell \geq 2$) zero-mean unit-variance exchangeable Gaussian sources with correlation coefficient $\rho$, compress their observations so that a joint decoder can reconstruct the sources within a prescribed mean squared error distortion based on the compressed data. The optimal rate-distortion performance of this system was previously known only for the two extreme cases $m=\ell$ (the centralized case) and $m=1$ (the distributed case); except when $\rho = 0$, the centralized system achieves strictly lower compression rates than the distributed system under every non-trivial distortion constraint. Somewhat surprisingly, the present paper establishes that the optimal rate-distortion performance of the generalized multiterminal system with $m \geq 2$ coincides with that of the centralized system for all distortions when $\rho \leq 0$, and for distortions below an explicit positive threshold (depending on $m$) when $\rho > 0$. Moreover, when $\rho > 0$, the minimum achievable rate of generalized multiterminal source coding subject to an arbitrary positive distortion constraint $d$ is shown to be within a finite gap (depending on $m$ and $d$) of its centralized counterpart in the large-$\ell$ limit, except possibly at the critical distortion $d = 1-\rho$.
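For concreteness, the following minimal sketch (not taken from the paper) builds the covariance matrix of the exchangeable Gaussian source model described above and computes the centralized ($m=\ell$) rate-distortion benchmark by standard reverse water-filling, assuming a sum mean squared error constraint of $\ell d$; the function names and parameter values are illustrative assumptions, not the paper's notation.

```python
# Minimal sketch (illustrative, not from the paper): exchangeable Gaussian model
# and the centralized (m = ell) rate-distortion benchmark via reverse water-filling,
# assuming a sum-MSE distortion constraint of ell * d.
import numpy as np


def exchangeable_cov(ell: int, rho: float) -> np.ndarray:
    """Covariance of ell zero-mean unit-variance exchangeable Gaussian sources."""
    # Positive semidefiniteness requires -1/(ell-1) <= rho <= 1.
    assert -1.0 / (ell - 1) <= rho <= 1.0
    return (1.0 - rho) * np.eye(ell) + rho * np.ones((ell, ell))


def centralized_rate(ell: int, rho: float, d: float) -> float:
    """Centralized rate in nats per source, by reverse water-filling on the
    eigenvalues 1 + (ell-1)*rho (multiplicity 1) and 1 - rho (multiplicity ell-1)."""
    lam = np.linalg.eigvalsh(exchangeable_cov(ell, rho))
    # Bisect for the water level theta with sum_i min(theta, lam_i) = ell * d.
    lo, hi = 0.0, max(lam.max(), ell * d)
    for _ in range(200):
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, lam).sum() < ell * d:
            lo = theta
        else:
            hi = theta
    dist = np.minimum(theta, lam)        # per-eigenvalue distortion allocation
    return 0.5 * np.sum(np.log(lam / dist)) / ell


if __name__ == "__main__":
    ell, rho = 10, 0.4                   # illustrative parameters
    for d in (0.2, 1 - rho, 0.8):
        print(f"d = {d:.2f}: centralized rate ~ {centralized_rate(ell, rho, d):.4f} nats/source")
```

Under this formulation, for $d \leq 1-\rho$ the water level equals $d$ and every eigenvalue remains active, whereas for $d > 1-\rho$ the $\ell-1$ eigenvalues equal to $1-\rho$ saturate, which is why $d = 1-\rho$ appears as the critical distortion in the statement above.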