Normal Laws for Two Entropy Estimators on Infinite Alphabets
[1] C. E. Shannon. A Mathematical Theory of Communication , 1948 .
[2] T. M. Cover and J. A. Thomas. Elements of Information Theory (2nd ed.) , 2006 .
[3] S. Zahl. Jackknifing an Index of Diversity , 1977 .
[4] Z. Zhang. Statistical Implications of Turing's Formula , 2016 .
[5] A. Antos, et al. Convergence properties of functional estimates for discrete distributions , 2001 .
[6] B. Harris. The Statistical Estimation of Entropy in the Non-Parametric Case , 1975 .
[7] M. Abramowitz and I. A. Stegun (Eds.). Handbook of Mathematical Functions , 1964 .
[8] L. Paninski. Estimation of Entropy and Mutual Information , 2003, Neural Computation.
[9] Z. Zhang, et al. Bias Adjustment for a Nonparametric Entropy Estimator , 2013, Entropy.
[10] J. Geluk. Π-regular variation , 1981 .
[11] X. Zhang, et al. A Normal Law for the Plug-in Estimator of Entropy , 2012, IEEE Transactions on Information Theory.