Entropic CLT for Smoothed Convolutions and Associated Entropy Bounds
[1] Christopher M. Bishop. Pattern Recognition and Machine Learning (Information Science and Statistics), 2006.
[2] Hyunjoong Kim, et al. Functional Analysis I, 2017.
[3] George Livadiotis, et al. High Density Nodes in the Chaotic Region of 1D Discrete Maps, 2018, Entropy.
[4] A. Barron, et al. Fisher information inequalities and the central limit theorem, 2001, arXiv:math/0111020.
[5] Elisabeth M. Werner, et al. Divergence for s-concave and log concave functions, 2013, arXiv:1307.5409.
[6] Gennadiy P. Chistyakov, et al. Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem, 2011, arXiv:1104.3994.
[7] Peng Xu, et al. Forward and Reverse Entropy Power Inequalities in Convex Geometry, 2016, arXiv.
[8] Murti V. Salapaka, et al. Error Bounds on a Mixed Entropy Inequality, 2018, International Symposium on Information Theory.
[9] L. Miclo. Notes on the Speed of Entropic Convergence in the Central Limit Theorem, 2003.
[10] O. Guédon, et al. Functional Versions of Lp-Affine Surface Area and Entropy Inequalities, 2014, arXiv:1402.3250.
[11] Nasser M. Nasrabadi, et al. Pattern Recognition and Machine Learning, 2006, Technometrics.
[12] C. Villani. Topics in Optimal Transportation, 2003.
[13] Jean Bourgain. On High Dimensional Maximal Functions Associated to Convex Bodies, 1986.
[14] Liyao Wang, et al. Beyond the Entropy Power Inequality, via Rearrangements, 2013, IEEE Transactions on Information Theory.
[15] Sergey G. Bobkov, et al. Entropic approach to E. Rio’s central limit theorem for W2 transport distance, 2013.
[16] Sergey G. Bobkov, et al. The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions, 2010, IEEE Transactions on Information Theory.
[17] S. Bobkov, et al. Local limit theorems for smoothed Bernoulli and other convolutions, 2019, Teoriya Veroyatnostei i ee Primeneniya.
[18] S. Bobkov, et al. Reverse Brunn–Minkowski and reverse entropy power inequalities for convex measures, 2011, arXiv:1109.5287.
[19] Devavrat Shah, et al. On entropy for mixtures of discrete and continuous variables, 2006, arXiv.
[20] Varun Jog, et al. An Entropy Inequality for Symmetric Random Variables, 2018 IEEE International Symposium on Information Theory (ISIT).
[21] Mokshay Madiman, et al. Entropy versus variance for symmetric log-concave random variables and related problems, 2018, arXiv.
[22] M. Talagrand. Transportation cost for Gaussian and other product measures, 1996.
[23] Assaf Naor, et al. On the rate of convergence in the entropic central limit theorem, 2004.
[24] Victoria Kostina, et al. A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications, 2017, Entropy.
[25] Peter Harremoës, et al. Rényi Divergence and Kullback-Leibler Divergence, 2012, IEEE Transactions on Information Theory.
[26] E. Lieb. Some Convexity and Subadditivity Properties of Entropy, 1975.
[27] Amir Dembo, et al. Information theoretic inequalities, 1991, IEEE Transactions on Information Theory.
[28] Lan V. Truong, et al. Support Recovery in the Phase Retrieval Model: Information-Theoretic Fundamental Limit, 2019, IEEE Transactions on Information Theory.
[29] A. Barron. Entropy and the Central Limit Theorem, 1986.
[30] S. Bobkov, et al. Local Limit Theorems for Smoothed Bernoulli and Other Convolutions, 2020.
[31] Murti V. Salapaka, et al. Relationships between certain f-divergences, 2019, 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[32] O. Johnson. Information Theory and the Central Limit Theorem, 2004.
[33] Alex Zhai, et al. The CLT in high dimensions: Quantitative bounds via martingale embedding, 2018, The Annals of Probability.
[34] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Information and Control.
[35] Alfred O. Hero, et al. Bounds on Variance for Unimodal Distributions, 2015, IEEE Transactions on Information Theory.