On the equivalence between Stein identity and de Bruijn identity

This paper illustrates the equivalence between two fundamental results: Stein's identity, originally proposed in the statistical estimation literature, and de Bruijn's identity, first established in information theory. Two distinct extensions of de Bruijn's identity are also presented. For arbitrary but fixed input and noise distributions, the first-order derivative of the differential entropy is expressed in terms of a function of the posterior mean, while the second-order derivative of the differential entropy is expressed in terms of a function of the Fisher information. Several applications illustrate the utility of the proposed results.
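For context, the classical scalar-Gaussian forms of the two identities referenced above are recalled below; these are standard statements, not the paper's generalized versions, which allow arbitrary input and noise distributions. Here h denotes differential entropy, J denotes Fisher information, and g is any differentiable function with \(\mathbb{E}\lvert g'(X)\rvert < \infty\).

\[
\mathbb{E}\bigl[(X-\mu)\,g(X)\bigr] \;=\; \sigma^{2}\,\mathbb{E}\bigl[g'(X)\bigr]
\qquad \text{(Stein's identity, } X \sim \mathcal{N}(\mu,\sigma^{2})\text{)}
\]

\[
\frac{\mathrm{d}}{\mathrm{d}t}\, h\bigl(X+\sqrt{t}\,Z\bigr) \;=\; \tfrac{1}{2}\, J\bigl(X+\sqrt{t}\,Z\bigr)
\qquad \text{(de Bruijn's identity, } Z \sim \mathcal{N}(0,1)\text{ independent of } X\text{)}
\]

The paper's extensions replace the Gaussian perturbation with an arbitrary noise distribution and characterize the first- and second-order derivatives of h through the posterior mean and the Fisher information, respectively, as stated in the abstract.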
