Limit Theorems in Hidden Markov Models
[1] Zhengyan Lin, et al. Limit Theory for Mixing Dependent Random Variables, 1997.
[2] L. Arnold, et al. Evolutionary Formalism for Products of Positive Random Matrices, 1994.
[3] R. Douc, et al. Consistency of the Maximum Likelihood Estimator for General Hidden Markov Models, 2009, arXiv:0912.4480.
[4] E. Seneta. Non-negative Matrices and Markov Chains, 2008.
[5] Eytan Domany, et al. The Entropy of a Binary Hidden Markov Process, 2005, arXiv.
[6] Yuval Peres, et al. A note on a complex Hilbert metric with application to domain of analyticity for entropy rate of hidden Markov processes, 2009, arXiv.
[7] Guangyue Han. Limit theorems for the sample entropy of hidden Markov chains, 2011, 2011 IEEE International Symposium on Information Theory.
[8] Wei Zeng, et al. Simulation-Based Computation of Information Rates for Channels With Memory, 2006, IEEE Transactions on Information Theory.
[9] Sandro Vaienti, et al. Fluctuations of the Metric Entropy for Mixing Measures, 2004.
[10] Tsachy Weissman, et al. Entropy of Hidden Markov Processes and Connections to Dynamical Systems: Papers from the Banff International Research Station Workshop, 2011.
[11] Henry D. Pfister, et al. The Capacity of Finite-State Channels in the High-Noise Regime, 2010, arXiv.
[12] R. Douc, et al. Asymptotic properties of the maximum likelihood estimator in autoregressive models with Markov regime, 2004, arXiv:math/0503681.
[13] Richard C. Bradley, et al. Introduction to Strong Mixing Conditions, 2007.
[14] R. Douc, et al. Asymptotics of the maximum likelihood estimator for general hidden Markov models, 2001.
[15] I. Ibragimov, et al. Some Limit Theorems for Stationary Processes, 1962.
[16] Jun Luo, et al. On the Entropy Rate of Hidden Markov Processes Observed Through Arbitrary Memoryless Channels, 2009, IEEE Transactions on Information Theory.
[17] Hans-Andrea Loeliger, et al. A Generalization of the Blahut–Arimoto Algorithm to Finite-State Channels, 2008, IEEE Transactions on Information Theory.
[18] Hans-Andrea Loeliger, et al. On the information rate of binary-input channels with memory, 2001, ICC 2001, IEEE International Conference on Communications.
[19] T. Rydén. Consistent and Asymptotically Normal Parameter Estimates for Hidden Markov Models, 1994.
[20] Brian H. Marcus, et al. Asymptotics of Input-Constrained Binary Symmetric Channel Capacity, 2008, arXiv.
[21] V. V. Petrov. Limit Theorems of Probability Theory: Sequences of Independent Random Variables, 1995.
[22] L. Gerencsér, et al. Recursive estimation of Hidden Markov Models, 2005, Proceedings of the 44th IEEE Conference on Decision and Control.
[23] A. C. Berry. The accuracy of the Gaussian approximation to the sum of independent variates, 1941.
[24] R. C. Bradley. Basic properties of strong mixing conditions. A survey and some open questions, 2005, arXiv:math/0511078.
[26] L. Baum, et al. Statistical Inference for Probabilistic Functions of Finite State Markov Chains, 1966.
[27] N. Haydn, et al. The Central Limit Theorem for uniformly strong mixing measures, 2009, arXiv:0903.1325.
[28] En-Hui Yang, et al. Non-asymptotic equipartition properties for independent and identically distributed sources, 2012, 2012 Information Theory and Applications Workshop.
[29] V. V. Petrov. On a Relation Between an Estimate of the Remainder in the Central Limit Theorem and the Law of the Iterated Logarithm, 1966.
[30] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[31] Andrea J. Goldsmith, et al. Capacity of Finite State Channels Based on Lyapunov Exponents of Random Matrices, 2006, IEEE Transactions on Information Theory.
[32] B. Leroux. Maximum-likelihood estimation for hidden Markov models, 1992.
[33] Eytan Domany, et al. Taylor series expansions for the entropy rate of Hidden Markov Processes, 2005, 2006 IEEE International Conference on Communications.
[35] Yuval Peres, et al. Entropy Rate for Hidden Markov Chains with Rare Transitions, 2010, arXiv.
[36] P. Bickel, et al. Asymptotic normality of the maximum-likelihood estimator for general hidden Markov models, 1998.
[37] Tsachy Weissman, et al. Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime, 2005, ISIT 2005, IEEE International Symposium on Information Theory.
[38] Paul H. Siegel, et al. On the achievable information rates of finite state ISI channels, 2001, GLOBECOM '01, IEEE Global Telecommunications Conference.
[39] Brian H. Marcus, et al. Derivatives of Entropy Rate in Special Families of Hidden Markov Chains, 2007, IEEE Transactions on Information Theory.
[40] Brian H. Marcus, et al. Asymptotics of Entropy Rate in Special Families of Hidden Markov Chains, 2010, IEEE Transactions on Information Theory.
[41] Ioannis Kontoyiannis, et al. Asymptotic Recurrence and Waiting Times for Stationary Processes, 1998.
[43] Brian H. Marcus, et al. Entropy rate of continuous-state hidden Markov chains, 2010, 2010 IEEE International Symposium on Information Theory.
[44] W. Philipp, et al. Almost sure invariance principles for partial sums of weakly dependent random variables, 1975.
[45] Laurent Mevel, et al. Asymptotical statistics of misspecified hidden Markov models, 2004, IEEE Transactions on Automatic Control.
[46] Venkat Anantharam, et al. An upper bound for the largest Lyapunov exponent of a Markovian product of nonnegative matrices, 2005, Theoretical Computer Science.
[47] Brian H. Marcus, et al. Concavity of mutual information rate for input-restricted finite-state memoryless channels at high SNR, 2009, 2009 IEEE International Symposium on Information Theory.
[48] Neri Merhav, et al. Hidden Markov processes, 2002, IEEE Transactions on Information Theory.
[49] Peter J. Bickel, et al. Inference in hidden Markov models I: Local asymptotic normality in the stationary case, 1996.
[50] Brian H. Marcus, et al. Analyticity of Entropy Rate of Hidden Markov Chains, 2005, IEEE Transactions on Information Theory.
[51] H. Chernoff. A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations, 1952.
[52] M. Loève. On Almost Sure Convergence, 1951.
[53] S. Bernstein. Sur l'extension du théorème limite du calcul des probabilités aux sommes de quantités dépendantes, 1927.
[54] M. Kh. Reznik. The Law of the Iterated Logarithm for Some Classes of Stationary Processes, 1968.
[55] J. Norris. Appendix: probability and measure, 1997.
[56] Edgardo Ugalde, et al. On the Preservation of Gibbsianness under Symbol Amalgamation, 2009, arXiv:0907.0528.
[58] Vladimir B. Balakirsky, et al. On the entropy rate of a hidden Markov model, 2004, ISIT 2004, IEEE International Symposium on Information Theory.
[59] John J. Birch. Approximations for the Entropy for Functions of Markov Chains, 1962.
[60] V. Sharma, et al. Entropy and channel capacity in the regenerative setup with applications to Markov channels, 2001, 2001 IEEE International Symposium on Information Theory.
[61] Laurent Mevel, et al. Exponential Forgetting and Geometric Ergodicity in Hidden Markov Models, 2000, Mathematics of Control, Signals and Systems.
[62] Paul H. Siegel, et al. On the capacity of finite state channels and the analysis of convolutional accumulate-m codes, 2003.