On an Extremal Data Processing Inequality for Long Markov Chains

We pose the following extremal conjecture: Let X, Y be jointly Gaussian random variables with linear correlation ρ. For any random variables U, V for which U, X, Y, V form a Markov chain, in that order, we conjecture that

2^{-2[I(X;V)+I(Y;U)]} ≥ (1 − ρ²)·2^{-2I(U;V)} + ρ²·2^{-2[I(X;U)+I(Y;V)]}.

By letting V be constant, we see that this inequality generalizes a well-known extremal result proved by Oohama in his work on the quadratic Gaussian one-helper problem. If valid, the conjecture would have some interesting consequences. For example, the converse for the quadratic Gaussian two-encoder source coding problem would follow from the converse for multiterminal source coding under logarithmic loss, thus unifying the two results under a common framework. Although the conjecture remains open, we discuss both analytical and numerical evidence supporting its validity.
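As a quick sanity check of the conjectured inequality (not taken from the paper itself), one can evaluate both sides in closed form when the auxiliaries are Gaussian test channels, U = X + N₁ and V = Y + N₂ with independent Gaussian noise, so that U–X–Y–V is Markov. For jointly Gaussian pairs with correlation r, I = -(1/2) log₂(1 − r²), hence 2^{-2I} = 1 − r². The sketch below (the function name and noise parameterization are my own choices) computes both sides; a short calculation shows they coincide for this Gaussian family, consistent with the bound being tight.

```python
import math

def gaussian_extremal_check(rho, s1, s2):
    """Evaluate both sides of the conjectured inequality
        2^{-2[I(X;V)+I(Y;U)]} >= (1-rho^2) 2^{-2 I(U;V)} + rho^2 2^{-2[I(X;U)+I(Y;V)]}
    for X, Y ~ N(0,1) with correlation rho, and Gaussian test channels
    U = X + N(0, s1^2), V = Y + N(0, s2^2) (independent noises).
    For jointly Gaussian variables with correlation r, 2^{-2I} = 1 - r^2.
    """
    a2 = 1.0 / (1.0 + s1**2)   # squared correlation of (X, U)
    b2 = 1.0 / (1.0 + s2**2)   # squared correlation of (Y, V)
    r2 = rho**2
    # I(X;V) involves corr(X,V)^2 = rho^2 * b2; I(Y;U) involves rho^2 * a2;
    # I(U;V) involves corr(U,V)^2 = a2 * rho^2 * b2.
    lhs = (1 - r2 * b2) * (1 - r2 * a2)                 # 2^{-2[I(X;V)+I(Y;U)]}
    rhs = (1 - r2) * (1 - r2 * a2 * b2) \
          + r2 * (1 - a2) * (1 - b2)                     # conjectured lower bound
    return lhs, rhs

lhs, rhs = gaussian_extremal_check(rho=0.7, s1=0.5, s2=1.3)
print(lhs, rhs)  # both sides agree up to floating-point rounding
```

Expanding both products gives 1 − ρ²a² − ρ²b² + ρ⁴a²b² on each side, so for Gaussian auxiliaries the conjectured inequality holds with equality, which is the kind of numerical evidence the abstract alludes to.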

[1] Tsachy Weissman, et al., "Multiterminal Source Coding Under Logarithmic Loss," IEEE Transactions on Information Theory, 2011.

[2] Pramod Viswanath, et al., "Rate Region of the Quadratic Gaussian Two-Encoder Source-Coding Problem," ISIT, 2006.

[3] Y. Oohama, "Gaussian multiterminal source coding," Proceedings of 1995 IEEE International Symposium on Information Theory, 1995.

[4] Tie Liu, et al., "An Extremal Inequality Motivated by Multiterminal Information Theoretic Problems," ISIT, 2006.

[5] Venkat Anantharam, et al., "On Maximal Correlation, Hypercontractivity, and the Data Processing Inequality studied by Erkip and Cover," arXiv, 2013.

[6] Thomas A. Courtade, "Information masking and amplification: The source coding setting," 2012 IEEE International Symposium on Information Theory Proceedings, 2012.

[7] Nelson M. Blachman, et al., "The convolution inequality for entropy powers," IEEE Transactions on Information Theory, 1965.

[8] Maxim Raginsky, et al., "Logarithmic Sobolev inequalities and strong data processing theorems for discrete channels," 2013 IEEE International Symposium on Information Theory, 2013.