A de Bruijn identity for symmetric stable laws

We show how some attractive information-theoretic properties of Gaussian densities carry over to more general families of stable densities. We define a new score function for symmetric stable laws and use it to formulate a stable analogue of the heat equation. From this we derive a version of the de Bruijn identity, which expresses the derivative of relative entropy as an inner product of score functions. We also discuss maximum entropy properties of symmetric stable densities.
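For orientation, the classical Gaussian case that this work generalises can be stated as follows; this is standard background rather than a result of the paper, and the paper's stable-law version presumably uses a different score function and normalisation. If X has a smooth density and Z is a standard Gaussian independent of X, write Y_t = X + \sqrt{t} Z with density f_t and score \rho_t(y) = \partial_y \log f_t(y). Then

  \frac{\mathrm{d}}{\mathrm{d}t}\, h(Y_t) \;=\; \tfrac{1}{2}\, J(Y_t) \;=\; \tfrac{1}{2}\, \mathbb{E}\!\left[\rho_t(Y_t)^2\right],

which follows from the Gaussian heat equation \partial_t f_t = \tfrac{1}{2}\, \partial_{yy} f_t. For a symmetric \alpha-stable semigroup the analogous evolution is the fractional heat equation \partial_t f_t = -(-\Delta)^{\alpha/2} f_t (up to a constant depending on the chosen parametrisation), which motivates the stable version of the heat equation referred to in the abstract.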
