Entropy and the law of small numbers
[1] Neil O'Connell. Information-Theoretic Proof of the Hewitt-Savage Zero-One Law, 2000.
[2] Vydas Čekanavičius. On the convergence of Markov binomial to Poisson distribution, 2002.
[3] J. Doob. Stochastic Processes, 1953.
[4] S. Bobkov, et al. On Modified Logarithmic Sobolev Inequalities for Bernoulli and Poisson Measures, 1998.
[5] L. Gordon, et al. Two moments suffice for Poisson approximations: the Chen-Stein method, 1989.
[6] V. Papathanasiou. Some characteristic properties of the Fisher information matrix via Cacoullos-type inequalities, 1993.
[7] Amir Dembo, et al. Large Deviations Techniques and Applications, 1998.
[8] Abram Kagan, et al. A discrete version of the Stam inequality and a characterization of the Poisson distribution, 2001.
[9] O. Johnson. Entropy inequalities and the Central Limit Theorem, 2000.
[10] Oliver Johnson. Convergence to the Poisson Distribution, 2004.
[11] R. Durrett. Probability: Theory and Examples, 1993.
[12] L. Saloff-Coste, et al. Lectures on finite Markov chains, 1997.
[13] R. R. Bahadur. Some Limit Theorems in Statistics, 1987.
[14] Iain M. Johnstone, et al. Une mesure d'information caractérisant la loi de Poisson, 1987.
[15] A. Barron. Entropy and the Central Limit Theorem, 1986.
[16] Nelson M. Blachman, et al. The convolution inequality for entropy powers, 1965, IEEE Trans. Inf. Theory.
[17] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Inf. Control.
[18] Byoung-Seon Choi, et al. Conditional limit theorems under Markov conditioning, 1987, IEEE Trans. Inf. Theory.
[19] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[20] A. Barron. Limits of information, Markov chains, and projection, 2000, 2000 IEEE International Symposium on Information Theory (Cat. No.00CH37060).
[21] P. Deheuvels, et al. A Semigroup Approach to Poisson Approximation, 1986.
[22] Richard F. Serfozo. Correction: Compound Poisson Approximations for Sums of Random Variables, 1988.
[23] Peter Harremoës, et al. Binomial and Poisson distributions as maximum entropy distributions, 2001, IEEE Trans. Inf. Theory.
[24] J. Bekenstein. The Limits of Information, 2000, gr-qc/0009019.
[25] J. Linnik. An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions, 1959.
[26] I. Csiszár. Sanov Property, Generalized $I$-Projection and a Conditional Limit Theorem, 1984.
[27] David G. Kendall. Information theory and the limit-theorem for Markov chains and processes with a countable infinity of states, 1963.
[28] Richard F. Serfozo. Compound Poisson Approximations for Sums of Random Variables, 1986.
[29] A. Rényi. On Measures of Entropy and Information, 1961.
[30] B. Gnedenko, et al. Random Summation: Limit Theorems and Applications, 1996.
[31] L. Le Cam. An approximation theorem for the Poisson binomial distribution, 1960.
[32] A. Barbour, et al. Poisson Approximation, 1992.
[33] P. Harremoës. The information topology, 2002.
[34] Chris A. J. Klaassen, et al. On an Inequality of Chernoff, 1985.