Entropy power inequality for a family of discrete random variables