The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities