Asymptotic properties of the MLE in hidden Markov models

We consider a hidden Markov model (HMM) with multidimensional observations, whose coefficients (transition probability matrix and observation conditional densities) depend on some unknown parameter. We investigate the asymptotic behaviour of the maximum likelihood estimator (MLE) as the number of observations tends to infinity. We exhibit the associated Kullback-Leibler information and show that the MLE is consistent, i.e. it converges to the set of minima of the Kullback-Leibler information. Finally, we prove that the MLE is asymptotically normal, under standard assumptions.
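For concreteness, a minimal sketch of the setting in notation introduced here for illustration (the parameter set $\Theta$, true parameter $\theta^\star$, observations $Y_1,\dots,Y_n$, joint density $p_\theta$, and limit matrix $\Sigma(\theta^\star)$ are assumptions of this sketch, not fixed by the abstract): the MLE and the Kullback-Leibler information are of the form
\[
\hat\theta_n \in \arg\max_{\theta \in \Theta} \, \frac{1}{n} \log p_\theta(Y_1,\dots,Y_n),
\qquad
K(\theta) \,=\, \lim_{n\to\infty} \frac{1}{n}\,
\mathbb{E}_{\theta^\star}\!\left[ \log \frac{p_{\theta^\star}(Y_1,\dots,Y_n)}{p_\theta(Y_1,\dots,Y_n)} \right],
\]
so that consistency means convergence of $\hat\theta_n$ to the set $\{\theta \in \Theta : K(\theta) = \min_{\theta'} K(\theta')\}$, and asymptotic normality is typically a statement of the type $\sqrt{n}\,(\hat\theta_n - \theta^\star) \Rightarrow \mathcal{N}\bigl(0, \Sigma(\theta^\star)\bigr)$ under identifiability and regularity assumptions.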