On the Entropy of Compound Distributions on Nonnegative Integers
[1] W. J. Studden et al. Tchebycheff Systems: With Applications in Analysis and Statistics. 1967.
[2] Oliver Johnson et al. Entropy and the law of small numbers. IEEE Transactions on Information Theory, 2005.
[3] Mokshay M. Madiman et al. Fisher Information, Compound Poisson Approximation, and the Poisson Channel. 2007 IEEE International Symposium on Information Theory, 2007.
[4] Mokshay M. Madiman et al. Generalized Entropy Power Inequalities and Monotonicity Properties of Information. IEEE Transactions on Information Theory, 2006.
[5] William Feller. An Introduction to Probability Theory and Its Applications. 1967.
[6] Oliver Johnson et al. Thinning and the Law of Small Numbers. 2007 IEEE International Symposium on Information Theory, 2007.
[7] Louis H. Y. Chen et al. Compound Poisson Approximation for Nonnegative Random Variables Via Stein's Method. 1992.
[8] F. Steutel et al. Infinite Divisibility of Probability Distributions on the Real Line. 2003.
[9] L. Gleser. On the Distribution of the Number of Successes in Independent Trials. 1975.
[10] Mokshay M. Madiman et al. On the entropy and log-concavity of compound Poisson measures. arXiv, 2008.
[11] Yaming Yu et al. Relative log-concavity and a pair of triangle inequalities. arXiv:1010.2043, 2010.
[12] Harry H. Panjer et al. Recursive Evaluation of a Family of Compound Distributions. ASTIN Bulletin, 1981.
[13] S. K. Katti. Infinite Divisibility of Integer-Valued Random Variables. 1967.
[14] William Feller. An Introduction to Probability Theory and Its Applications. 1950.
[15] Yaming Yu et al. On an inequality of Karlin and Rinott concerning weighted sums of i.i.d. random variables. Advances in Applied Probability, 2008.
[16] Ingram Olkin et al. Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution. 1981.
[17] K. Ball et al. Solution of Shannon's problem on the monotonicity of entropy. 2004.
[18] P. Mateev. On the Entropy of the Multinomial Distribution. 1978.
[19] Alain Jean-Marie et al. Stochastic comparisons for queueing models via random sums and intervals. Advances in Applied Probability, 1992.
[20] Bjorn G. Hansen et al. On Log-Concave and Log-Convex Infinitely Divisible Sequences and Densities. 1988.
[21] Charles M. Grinstead et al. Introduction to Probability. 1986.
[22] Yaming Yu et al. Monotonic Convergence in an Information-Theoretic Law of Small Numbers. IEEE Transactions on Information Theory, 2008.
[23] J. Darroch. On the Distribution of the Number of Successes in Independent Trials. 1964.
[24] Oliver Johnson et al. Thinning and information projections. 2008 IEEE International Symposium on Information Theory, 2008.
[25] A. Barron. Entropy and the central limit theorem. 1986.
[26] M. Shaked et al. Stochastic Orders. 2008.
[27] Ward Whitt et al. Uniform conditional variability ordering of probability distributions. 1985.
[28] Antonia Maria Tulino et al. Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof. IEEE Transactions on Information Theory, 2006.
[29] Andrew D. Barbour et al. Compound Poisson approximation: a user's guide. 2001.
[30] O. Johnson. Log-concavity and the maximum entropy property of the Poisson distribution. arXiv:math/0603647, 2006.
[31] Yaming Yu et al. On the Maximum Entropy Properties of the Binomial Distribution. IEEE Transactions on Information Theory, 2008.
[32] Peter Harremoës et al. Binomial and Poisson distributions as maximum entropy distributions. IEEE Transactions on Information Theory, 2001.
[33] Mokshay M. Madiman et al. Entropy, compound Poisson approximation, log-Sobolev inequalities and measure concentration. IEEE Information Theory Workshop, 2004.
[34] S. Karlin et al. Entropy inequalities for classes of probability distributions I. The univariate case. Advances in Applied Probability, 1981.
[35] R. Pemantle. Towards a theory of negative dependence. arXiv:math/0404095, 2000.