Lower Bound for Derivatives of Costa's Differential Entropy

Several conjectures concern lower bounds on the derivatives of the differential entropy $H(X_t)$ of the $n$-dimensional random vector $X_t$ introduced by Costa. Cheng and Geng conjectured that $H(X_t)$ is completely monotone in $t$, that is, $C_1(m,n)$: $(-1)^{m+1}(d^m/dt^m)H(X_t)\ge 0$. McKean conjectured that the Gaussian $X_{Gt}$ achieves the minimum of $(-1)^{m+1}(d^m/dt^m)H(X_t)$ under certain conditions, that is, $C_2(m,n)$: $(-1)^{m+1}(d^m/dt^m)H(X_t)\ge(-1)^{m+1}(d^m/dt^m)H(X_{Gt})$. McKean's conjecture had previously been considered only in the univariate case: $C_2(1,1)$ and $C_2(2,1)$ were proved by McKean, and $C_2(i,1)$, $i=3,4,5$, were proved by Zhang, Anantharam, and Geng under the log-concavity assumption. In this paper, we prove $C_2(1,n)$ and $C_2(2,n)$, and observe that McKean's conjecture might not hold for $n>1$ and $m>2$. We therefore propose a weaker version, $C_3(m,n)$: $(-1)^{m+1}(d^m/dt^m)H(X_t)\ge(-1)^{m+1}\frac{1}{n}(d^m/dt^m)H(X_{Gt})$, and prove $C_3(3,2)$, $C_3(3,3)$, $C_3(3,4)$, and $C_3(4,2)$ under the log-concavity assumption. A systematic procedure for proving $C_l(m,n)$ based on semidefinite programming is proposed, and the results above are established with it.
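To make the compared quantities concrete, here is a brief worked sketch, assuming Costa's standard setup $X_t = X + \sqrt{t}\,Z$ with $Z \sim \mathcal{N}(0, I_n)$ independent of $X$, and an illustrative Gaussian baseline $X_G \sim \mathcal{N}(0, \sigma^2 I_n)$ (the precise normalization imposed on the baseline is fixed in the paper). De Bruijn's identity gives

\[
\frac{d}{dt} H(X_t) = \frac{1}{2} J(X_t) \ge 0,
\]

where $J$ denotes Fisher information; this is the $m=1$ case of $C_1(m,n)$. For the Gaussian baseline,

\[
H(X_{Gt}) = \frac{n}{2}\log\bigl(2\pi e\,(\sigma^2+t)\bigr), \qquad
\frac{d}{dt} H(X_{Gt}) = \frac{n}{2(\sigma^2+t)}, \qquad
\frac{d^2}{dt^2} H(X_{Gt}) = -\frac{n}{2(\sigma^2+t)^2},
\]

so $C_2(1,n)$ asserts $\tfrac{1}{2}J(X_t) \ge \tfrac{n}{2(\sigma^2+t)}$, and $C_3(m,n)$ weakens the right-hand side of $C_2(m,n)$ by a factor of $1/n$.

As for the semidefinite programming step, the procedure reduces each inequality $C_l(m,n)$ to certifying that a polynomial expression in derivatives of the density (obtained after integration by parts) is a sum of squares, which is an SDP feasibility problem. The following minimal Python sketch using cvxpy shows the shape of that feasibility check; the toy polynomial is a stand-in for the paper's actual certificates, not one of them.

import cvxpy as cp

# Toy target: p(x, y) = x^4 + 2 x^2 y^2 + y^4 + x^2 + y^2.
# With the monomial basis v = (x^2, x*y, y^2, x, y), p is a sum of
# squares iff p = v^T Q v for some symmetric positive semidefinite Q.
Q = cp.Variable((5, 5), PSD=True)

# Match the coefficient of every monomial of v^T Q v against p.
constraints = [
    Q[0, 0] == 1,                    # x^4
    Q[2, 2] == 1,                    # y^4
    Q[1, 1] + 2 * Q[0, 2] == 2,      # x^2 y^2
    Q[3, 3] == 1,                    # x^2
    Q[4, 4] == 1,                    # y^2
    Q[0, 1] == 0,                    # x^3 y
    Q[0, 3] == 0,                    # x^3
    Q[0, 4] + Q[1, 3] == 0,          # x^2 y
    Q[1, 2] == 0,                    # x y^3
    Q[1, 4] + Q[2, 3] == 0,          # x y^2
    Q[2, 4] == 0,                    # y^3
    Q[3, 4] == 0,                    # x y
]

# Feasibility: any PSD Q meeting the constraints is an SOS certificate.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status)  # "optimal" here, since p = (x^2 + y^2)^2 + x^2 + y^2

The paper's linear matrix inequalities involve many more monomials (in the density, its derivatives, and their moments), but the feasibility structure is the same.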

References

[1] S. Verdú et al., A simple proof of the entropy-power inequality, 2006, IEEE Transactions on Information Theory.

[2] E. Lieb, Proof of an entropy conjecture of Wehrl, 1978.

[3] G. Toscani, A concavity property for the reciprocal of Fisher information and its consequences on Costa's EPI, 2014, arXiv.

[4] H. McKean, Speed of approach to equilibrium for Kac's caricature of a Maxwellian gas, 1966.

[5] A. van den Hengel et al., Semidefinite Programming, 2014, Computer Vision: A Reference Guide.

[6] M. H. M. Costa, A new entropy power inequality, 1985, IEEE Transactions on Information Theory.

[7] M. H. M. Costa, On the Gaussian interference channel, 1985, IEEE Transactions on Information Theory.

[8] F. Cheng et al., Higher Order Derivatives in Costa's Entropy Power Inequality, 2014, IEEE Transactions on Information Theory.

[9] V. Anantharam et al., Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities, 2018, Entropy.

[10] C. E. Shannon, A Mathematical Theory of Communication, 1948.

[11] S. P. Boyd et al., Semidefinite Programming, 1996, SIAM Review.

[12] S. P. Boyd et al., Convex Optimization, 2004, Cambridge University Press.

[13] A. Dembo et al., Simple proof of the concavity of the entropy power with respect to Gaussian noise, 1989, IEEE Transactions on Information Theory.

[14] L. Wang et al., A new approach to the entropy power inequality, via rearrangements, 2013, IEEE International Symposium on Information Theory.

[15] A. J. Stam, Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Information and Control.

[16] C. Villani, A short proof of the "Concavity of entropy power", 2000, IEEE Transactions on Information Theory.

[17] O. Rioul, Information Theoretic Proofs of Entropy Power Inequalities, 2007, IEEE Transactions on Information Theory.

[18] P. P. Bergmans, A simple converse for broadcast channels with additive white Gaussian noise (Corresp.), 1974, IEEE Transactions on Information Theory.

[19] T. Liu et al., An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems, 2006, IEEE Transactions on Information Theory.

[20] L. Guo et al., Prove Costa's Entropy Power Inequality and High Order Inequality for Differential Entropy with Semidefinite Programming, 2020, arXiv.

[21] N. M. Blachman, The convolution inequality for entropy powers, 1965, IEEE Transactions on Information Theory.

[22] E. Serpedin et al., Gaussian Assumption: The Least Favorable but the Most Useful [Lecture Notes], 2012, IEEE Signal Processing Magazine.