Consistency of the plug-in estimator of the entropy rate for ergodic processes
[1] Robert M. Gray, et al. The ergodic decomposition of stationary discrete random processes, 1974, IEEE Transactions on Information Theory.
[2] Łukasz Dębowski, et al. Mixing, Ergodic, and Nonergodic Processes With Rapidly Growing Information Between Blocks, 2011, IEEE Transactions on Information Theory.
[3] Boris Ryabko, et al. Applications of Universal Source Coding to Statistical Analysis of Time Series, 2008, ArXiv.
[4] Abraham Lempel, et al. A universal algorithm for sequential data compression, 1977, IEEE Transactions on Information Theory.
[5] T. Cover, et al. A sandwich proof of the Shannon-McMillan-Breiman theorem, 1988.
[6] K. Marton, et al. Entropy and the Consistent Estimation of Joint Distributions, 1993, Proceedings, IEEE International Symposium on Information Theory.
[7] Zhiyi Zhang, et al. Entropy Estimation in Turing's Perspective, 2012, Neural Computation.
[8] Łukasz Dębowski, et al. Regular Hilberg Processes: An Example of Processes With a Vanishing Entropy Rate, 2017, IEEE Transactions on Information Theory.
[9] Benjamin Weiss, et al. How Sampling Reveals a Process, 1990.
[10] Xing Zhang, et al. A Normal Law for the Plug-in Estimator of Entropy, 2012, IEEE Transactions on Information Theory.
[11] Yanjun Han, et al. Minimax Estimation of Functionals of Discrete Distributions, 2014, IEEE Transactions on Information Theory.
[12] Łukasz Dębowski, et al. Estimation of Entropy from Subword Complexity, 2016, Challenges in Computational Statistics and Data Mining.
[13] David L. Neuhoff, et al. Simplistic Universal Coding, 1998, IEEE Transactions on Information Theory.