Limit theorems for the sample entropy of hidden Markov chains
[1] A. C. Berry. The accuracy of the Gaussian approximation to the sum of independent variates, 1941.
[2] Paul H. Siegel, et al. On the capacity of finite state channels and the analysis of convolutional accumulate-m codes, 2003.
[3] Henry D. Pfister, et al. The Capacity of Finite-State Channels in the High-Noise Regime, 2010, arXiv.
[4] Richard C. Bradley, et al. Introduction to strong mixing conditions, 2007.
[5] P. Billingsley, et al. Probability and Measure, 1980.
[6] Vladimir B. Balakirsky, et al. On the entropy rate of a hidden Markov model, 2004, 2004 IEEE International Symposium on Information Theory (ISIT 2004).
[7] Andrea J. Goldsmith, et al. Capacity of Finite State Channels Based on Lyapunov Exponents of Random Matrices, 2006, IEEE Transactions on Information Theory.
[8] Eytan Domany, et al. The Entropy of a Binary Hidden Markov Process, 2005, arXiv.
[9] John J. Birch. Approximations for the Entropy for Functions of Markov Chains, 1962.
[10] Philippe Jacquet, et al. Noisy Constrained Capacity, 2007, 2007 IEEE International Symposium on Information Theory.
[11] Brian H. Marcus, et al. Analyticity of Entropy Rate of Hidden Markov Chains, 2005, IEEE Transactions on Information Theory.
[12] H. Chernoff. A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations, 1952.
[13] M. Loève. On Almost Sure Convergence, 1951.
[14] Hans-Andrea Loeliger, et al. A Generalization of the Blahut–Arimoto Algorithm to Finite-State Channels, 2008, IEEE Transactions on Information Theory.
[15] Hans-Andrea Loeliger, et al. On the information rate of binary-input channels with memory, 2001, 2001 IEEE International Conference on Communications (ICC 2001).
[16] Brian H. Marcus, et al. Derivatives of Entropy Rate in Special Families of Hidden Markov Chains, 2007, IEEE Transactions on Information Theory.
[17] Brian H. Marcus, et al. Asymptotics of Entropy Rate in Special Families of Hidden Markov Chains, 2010, IEEE Transactions on Information Theory.
[18] S. Bernstein. Sur l'extension du théorème limite du calcul des probabilités aux sommes de quantités dépendantes, 1927.
[19] R. C. Bradley. Basic Properties of Strong Mixing Conditions, 1985.
[20] L. Arnold, et al. Evolutionary Formalism for Products of Positive Random Matrices, 1994.
[21] Brian H. Marcus, et al. Concavity of mutual information rate for input-restricted finite-state memoryless channels at high SNR, 2009, 2009 IEEE International Symposium on Information Theory.
[22] Neri Merhav, et al. Hidden Markov processes, 2002, IEEE Transactions on Information Theory.
[23] Zhengyan Lin, et al. Limit Theory for Mixing Dependent Random Variables, 1997.
[24] R. C. Bradley. Basic properties of strong mixing conditions. A survey and some open questions, 2005, math/0511078.
[25] Philippe Jacquet, et al. On the entropy of a hidden Markov process, 2004, Data Compression Conference (DCC 2004).
[26] Eytan Domany, et al. Taylor series expansions for the entropy rate of Hidden Markov Processes, 2005, 2006 IEEE International Conference on Communications.
[27] Venkat Anantharam, et al. An upper bound for the largest Lyapunov exponent of a Markovian product of nonnegative matrices, 2005, Theoretical Computer Science.
[28] Tsachy Weissman, et al. Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime, 2005, 2005 IEEE International Symposium on Information Theory (ISIT 2005).
[29] P. Lezaud. Chernoff-type bound for finite Markov chains, 1998.
[30] Paul H. Siegel, et al. On the achievable information rates of finite state ISI channels, 2001, IEEE Global Telecommunications Conference (GLOBECOM '01).
[31] V. Sharma, et al. Entropy and channel capacity in the regenerative setup with applications to Markov channels, 2001, 2001 IEEE International Symposium on Information Theory.
[32] Yuval Peres, et al. Entropy Rate for Hidden Markov Chains with rare transitions, 2010, arXiv.
[33] Tsachy Weissman, et al. On the optimality of symbol-by-symbol filtering and denoising, 2004, IEEE Transactions on Information Theory.
[34] Tsachy Weissman, et al. New bounds on the entropy rate of hidden Markov processes, 2004, IEEE Information Theory Workshop.
[35] Jun Luo, et al. On the Entropy Rate of Hidden Markov Processes Observed Through Arbitrary Memoryless Channels, 2009, IEEE Transactions on Information Theory.
[36] Wei Zeng, et al. Simulation-Based Computation of Information Rates for Channels With Memory, 2006, IEEE Transactions on Information Theory.
[37] Brian H. Marcus, et al. Asymptotics of Input-Constrained Binary Symmetric Channel Capacity, 2008, arXiv.