[1] Liming Wu,et al. A new modified logarithmic Sobolev inequality for Poisson point processes and several applications , 2000 .
[2] Varun Jog,et al. The Entropy Power Inequality and Mrs. Gerber's Lemma for Groups of Order 2^n , 2014, IEEE Trans. Inf. Theory.
[3] C. Villani. Topics in Optimal Transportation , 2003 .
[4] C. Villani. Optimal Transport: Old and New , 2008 .
[5] M. Ledoux,et al. Analysis and Geometry of Markov Diffusion Operators , 2013 .
[6] N. Papadatos,et al. Poisson approximation for a sum of dependent indicators: an alternative approach , 2002, Advances in Applied Probability.
[7] P. Meyer,et al. Sur les inégalités de Sobolev logarithmiques. I , 1982 .
[8] Mokshay M. Madiman,et al. Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures , 2009, Discret. Appl. Math.
[9] Aaron D. Wyner,et al. A theorem on the entropy of certain binary sequences and applications-I , 1973, IEEE Trans. Inf. Theory.
[10] Paul-Marie Samson,et al. Displacement convexity of entropy and related inequalities on graphs , 2012, Probability Theory and Related Fields.
[11] S. Kullback,et al. A lower bound for discrimination information in terms of variation (Corresp.) , 1967, IEEE Trans. Inf. Theory.
[12] Yann Brenier,et al. A computational fluid mechanics solution to the Monge-Kantorovich mass transfer problem , 2000, Numerische Mathematik.
[13] Oliver Johnson,et al. Concavity of entropy under thinning , 2009, 2009 IEEE International Symposium on Information Theory.
[14] Fraser Daly. Negative dependence and stochastic orderings , 2015, 1504.06493.
[15] E. Carlen,et al. Entropy production by block variable summation and central limit theorems , 1991 .
[16] Ingram Olkin,et al. Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution , 1981 .
[17] Jae Oh Woo,et al. A lower bound on the Rényi entropy of convolutions in the integers , 2014, 2014 IEEE International Symposium on Information Theory.
[18] Iain M. Johnstone,et al. Une mesure d'information caractérisant la loi de Poisson , 1987 .
[19] Yvik Swan,et al. Stein's density approach for discrete distributions and information inequalities , 2012, ArXiv.
[20] Yaming Yu,et al. Monotonic Convergence in an Information-Theoretic Law of Small Numbers , 2008, IEEE Transactions on Information Theory.
[21] Giuseppe Toscani,et al. The fractional Fisher information and the central limit theorem for stable laws , 2015, ArXiv.
[22] Luis A. Caffarelli. Monotonicity Properties of Optimal Transportation and the FKG and Related Inequalities , 2000 .
[23] Shlomo Shamai,et al. Mutual information and minimum mean-square error in Gaussian channels , 2004, IEEE Transactions on Information Theory.
[24] Oliver Johnson. A discrete log-Sobolev inequality under a Bakry–Émery type condition , 2015, ArXiv.
[25] B. Roos. Kerstan's method for compound Poisson approximation , 2003 .
[27] A. Barron. Entropy and the central limit theorem , 1986 .
[28] Assaf Naor,et al. On the rate of convergence in the entropic central limit theorem , 2004 .
[29] K. Chung,et al. Limit Distributions for Sums of Independent Random Variables , 1955 .
[30] R. Pemantle. Towards a theory of negative dependence , 2000, math/0404095.
[31] A. Barbour,et al. Poisson Approximation , 1992 .
[32] Aaron D. Wyner,et al. A theorem on the entropy of certain binary sequences and applications-II , 1973, IEEE Trans. Inf. Theory.
[33] Calyampudi R. Rao,et al. Chapter 3: Differential and Integral Geometry in Statistical Inference , 1987 .
[34] E. Lieb. Proof of an entropy conjecture of Wehrl , 1978 .
[35] Oliver Johnson,et al. Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality , 2009, IEEE Transactions on Information Theory.
[36] Shlomo Shamai,et al. A binary analog to the entropy-power inequality , 1990, IEEE Trans. Inf. Theory.
[37] J. Linnik. An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions , 1959 .
[38] L. Ambrosio,et al. Gradient Flows: In Metric Spaces and in the Space of Probability Measures , 2005 .
[39] Djalil Chafaï. Binomial-Poisson entropic inequalities and the M/M/∞ queue , 2006 .
[40] C. Tsallis. Possible generalization of Boltzmann-Gibbs statistics , 1988 .
[41] Abram Kagan,et al. A discrete version of the Stam inequality and a characterization of the Poisson distribution , 2001 .
[42] Yaming Yu,et al. On the Entropy of Compound Distributions on Nonnegative Integers , 2009, IEEE Transactions on Information Theory.
[43] A natural derivative on [0,n] and a binomial Poincaré inequality , 2011, 1107.0127.
[44] T. Cacoullos. On Upper and Lower Bounds for the Variance of a Function of a Random Variable , 1982 .
[45] A. Barron,et al. Fisher information inequalities and the central limit theorem , 2001, math/0111020.
[46] Varun Jog,et al. The Entropy Power Inequality and Mrs. Gerber's Lemma for groups of order 2^n , 2013, 2013 IEEE International Symposium on Information Theory.
[47] Peter Harremoës,et al. Binomial and Poisson distributions as maximum entropy distributions , 2001, IEEE Trans. Inf. Theory.
[48] Emre Telatar,et al. A New Entropy Power Inequality for Integer-Valued Random Variables , 2014, IEEE Trans. Inf. Theory.
[49] V. Papathanasiou. Some characteristic properties of the Fisher information matrix via Cacoullos-type inequalities , 1993 .
[50] Yaming Yu,et al. Sharp Bounds on the Entropy of the Poisson Law and Related Quantities , 2010, IEEE Transactions on Information Theory.
[51] O. Johnson. Information Theory and the Central Limit Theorem , 2004 .
[52] F. Baccelli,et al. Characterization of Poisson Processes , 1987 .
[53] K. Ball,et al. Solution of Shannon's problem on the monotonicity of entropy , 2004 .
[54] G. Crooks. On Measures of Entropy and Information , 2015 .
[55] Dario Cordero-Erausquin,et al. Some Applications of Mass Transport to Gaussian-Type Inequalities , 2002 .
[56] Oliver Johnson,et al. Entropy and the law of small numbers , 2005, IEEE Transactions on Information Theory.
[57] Oliver Johnson,et al. A proof of the Shepp-Olkin entropy concavity conjecture , 2015, ArXiv.
[58] A. Rényi. On Measures of Entropy and Information , 1961 .
[59] Peng Xu,et al. Forward and Reverse Entropy Power Inequalities in Convex Geometry , 2016, ArXiv.
[60] Ryoichi Shimizu,et al. On Fisher’s Amount of Information for Location Family , 1975 .
[61] Nelson M. Blachman,et al. The convolution inequality for entropy powers , 1965, IEEE Trans. Inf. Theory.
[62] Gustavo Posta,et al. Convex entropy decay via the Bochner-Bakry-Émery approach , 2007 .
[63] Mokshay M. Madiman,et al. Generalized Entropy Power Inequalities and Monotonicity Properties of Information , 2006, IEEE Transactions on Information Theory.
[64] J. Maas,et al. Ricci Curvature of Finite Markov Chains via Convexity of the Entropy , 2011, 1111.2687.
[65] H. Chernoff. A Note on an Inequality Involving the Normal Distribution , 1981 .
[66] J. Shanthikumar,et al. Multivariate Stochastic Orders , 2007 .
[67] C. Villani,et al. Ricci curvature for metric-measure spaces via optimal transport , 2004, math/0412127.
[68] B. Gnedenko,et al. Random Summation: Limit Theorems and Applications , 1996 .
[69] Thomas M. Liggett. Ultra Logconcave Sequences and Negative Dependence , 1997, J. Comb. Theory, Ser. A.
[70] Karl-Theodor Sturm,et al. On the geometry of metric measure spaces , 2006 .
[71] P. Mateev. On the Entropy of the Multinomial Distribution , 1978 .
[72] Rene F. Swarttouw,et al. Orthogonal polynomials , 2020, NIST Handbook of Mathematical Functions.
[73] Yaming Yu,et al. On the Maximum Entropy Properties of the Binomial Distribution , 2008, IEEE Transactions on Information Theory.
[74] S. G. Bobkov,et al. Convergence to Stable Laws in Relative Entropy , 2011, 1104.4360.
[75] L. Gross. Logarithmic Sobolev inequalities , 1975 .
[76] Hans S. Witsenhausen. Some aspects of convexity useful in information theory , 1980, IEEE Trans. Inf. Theory.
[77] Assaf Naor,et al. Entropy jumps in the presence of a spectral gap , 2003 .
[78] S. G. Bobkov,et al. Fisher information and convergence to stable laws , 2012, 1205.3637.
[79] Gennadiy P. Chistyakov,et al. Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem , 2011, 1104.3994.
[80] Antonia Maria Tulino,et al. Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof , 2006, IEEE Transactions on Information Theory.
[81] C. E. Shannon. A Mathematical Theory of Communication , 1948, Bell System Technical Journal.
[82] J. Kingman. Uses of Exchangeability , 1978 .
[83] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon , 1959, Inf. Control..
[84] Alice Guionnet,et al. Lectures on Logarithmic Sobolev Inequalities , 2003 .
[85] Yvik Swan,et al. Local Pinsker Inequalities via Stein's Discrete Density Approach , 2012, IEEE Transactions on Information Theory.
[86] W. Gangbo,et al. Constrained steepest descent in the 2-Wasserstein metric , 2003, math/0312063.
[87] S. Bobkov,et al. On Modified Logarithmic Sobolev Inequalities for Bernoulli and Poisson Measures , 1998 .
[88] Karl-Theodor Sturm,et al. On the geometry of metric measure spaces. II , 2006 .
[89] Petter Brändén,et al. Iterated sequences and the geometry of zeros , 2011 .
[90] Naresh Sharma,et al. Entropy power inequality for a family of discrete random variables , 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[91] Christophe Vignat,et al. An entropy power inequality for the binomial family , 2003 .
[92] Oliver Johnson,et al. Discrete versions of the transport equation and the Shepp–Olkin conjecture , 2013, ArXiv.
[93] Gennadiy P. Chistyakov,et al. Berry–Esseen bounds in the entropic central limit theorem , 2011, 1105.4119.
[94] Amir Dembo,et al. Information theoretic inequalities , 1991, IEEE Trans. Inf. Theory.
[95] O. Johnson,et al. Bounds on the Poincaré constant under negative dependence , 2008, 0801.2112.
[96] D. Walkup. Pólya sequences, binomial convolution and the union of random sets , 1976, Journal of Applied Probability.
[97] Y. Derriennic. Entropie, théorèmes limite et marches aléatoires , 1986 .
[98] Calyampudi R. Rao. On the distance between two populations , 1949 .
[99] Chris A. J. Klaassen,et al. On an Inequality of Chernoff , 1985 .
[100] Venkat Anantharam. Counterexamples to a Proposed Stam Inequality on Finite Groups , 2010, IEEE Transactions on Information Theory.
[101] A. A. Borovkov,et al. On an Inequality and a Related Characterization of the Normal Distribution , 1984 .
[102] Oliver Johnson,et al. Thinning, Entropy, and the Law of Thin Numbers , 2009, IEEE Transactions on Information Theory.
[103] Oliver Johnson,et al. A de Bruijn identity for symmetric stable laws , 2013, ArXiv.
[104] Shun-ichi Amari,et al. Methods of information geometry , 2000 .
[105] O. Johnson. Log-concavity and the maximum entropy property of the Poisson distribution , 2006, math/0603647.
[106] Mokshay M. Madiman,et al. Compound Poisson Approximation via Information Functionals , 2010, ArXiv.