The hidden Markov model (HMM) is a probabilistic model of the joint distribution of a collection of random variables comprising both observations and states. The Gaussian mixture model (GMM) is a finite-mixture probability distribution model. Although the two models are closely related, they are usually discussed separately. The expectation-maximization (EM) algorithm is a general iterative method for finding maximum-likelihood estimates from incomplete data, and the EM formulae for the HMM and the GMM are similar. This paper makes two points. First, the EM algorithm of the GMM can be regarded as a special case of the EM algorithm of the HMM. Second, an EM algorithm for the GMM based on symbols is faster in implementation than the traditional EM algorithm based on samples (observations).
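The symbol-based speed-up can be illustrated with a minimal sketch, assuming a one-dimensional GMM over discretized data: when many samples share the same value ("symbol"), the E-step responsibilities need only be computed once per distinct symbol and then weighted by that symbol's count in the M-step. All function and variable names here are illustrative, not taken from the paper.

```python
# Hedged sketch: EM for a 1-D GMM where the E-step runs once per distinct
# symbol (weighted by its multiplicity) rather than once per raw sample.
# This is an illustration of the idea, not the paper's exact formulation.
import math
from collections import Counter

def gauss_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_symbols(data, mus, vars_, weights, iters=50):
    """EM for a GMM; responsibilities are computed per symbol, not per sample."""
    counts = Counter(data)          # symbol -> number of occurrences
    n = len(data)
    k = len(mus)
    for _ in range(iters):
        # E-step: one responsibility vector per *distinct* symbol.
        resp = {}
        for x in counts:
            p = [weights[j] * gauss_pdf(x, mus[j], vars_[j]) for j in range(k)]
            s = sum(p)
            resp[x] = [pj / s for pj in p]
        # M-step: the usual GMM updates, with each symbol's contribution
        # scaled by its count instead of summing over every raw sample.
        for j in range(k):
            nj = sum(counts[x] * resp[x][j] for x in counts)
            mus[j] = sum(counts[x] * resp[x][j] * x for x in counts) / nj
            vars_[j] = sum(counts[x] * resp[x][j] * (x - mus[j]) ** 2
                           for x in counts) / nj
            weights[j] = nj / n
    return mus, vars_, weights
```

With a dataset of n samples drawn from an alphabet of m distinct symbols, the E-step costs O(mk) per iteration instead of O(nk); when n is much larger than m the saving matches the abstract's claim.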