On-line estimation of hidden Markov model parameters based on the Kullback-Leibler information measure

Sequential, or online, hidden Markov model (HMM) signal processing schemes are derived, and their performance is illustrated by simulation. The online algorithms are sequential expectation maximization (EM) schemes, derived by using stochastic approximations to maximize the Kullback-Leibler information measure. The schemes can be implemented as filters, fixed-lag smoothers, or sawtooth-lag smoothers. They yield estimates of the HMM parameters, including transition probabilities, Markov state levels, and noise variance. In contrast to the offline EM algorithm (the Baum-Welch scheme), which relies on the fixed-interval forward-backward recursions, the online schemes have significantly reduced memory requirements and improved convergence, and they can track HMM parameters that vary slowly with time or undergo infrequent jump changes. Similar techniques are used to derive online schemes for extracting finite-state Markov chains embedded in a mixture of white Gaussian noise (WGN) and deterministic signals of known functional form with unknown parameters.
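To make the filter-based variant concrete, the following is a minimal sketch (not the paper's exact recursions) of a sequential-EM HMM filter for observations consisting of a Markov state level plus white Gaussian noise. The state posterior is propagated by a forward (filter) recursion, and the transition probabilities, state levels, and noise variance are updated at each step by stochastic-approximation averages with a decreasing step size. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def online_hmm_em(y, A0, mu0, var0, step=lambda t: 1.0 / (t + 1)):
    """Illustrative sequential-EM HMM filter (sketch, not the paper's algorithm).

    y    : 1-D array of observations y_t = (Markov state level) + WGN
    A0   : initial N x N transition-probability matrix (rows sum to 1)
    mu0  : initial state levels (length N), var0: initial noise variance
    step : decreasing stochastic-approximation gain gamma_t
    """
    A, mu, var = A0.copy(), mu0.astype(float).copy(), float(var0)
    N = len(mu)
    p = np.full(N, 1.0 / N)                   # filtered state posterior
    trans_stats = np.full((N, N), 1.0 / N**2)  # running soft transition counts
    for t, yt in enumerate(y):
        # Forward (filter) step: predict, then correct with Gaussian likelihoods.
        like = np.exp(-0.5 * (yt - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        joint = (p[:, None] * A) * like[None, :]   # joint over (prev, next) state
        p_new = (p @ A) * like + 1e-300            # guard against underflow
        p_new /= p_new.sum()
        # Stochastic-approximation parameter updates (online E- and M-steps).
        g = step(t)
        trans_stats = (1 - g) * trans_stats + g * joint / joint.sum()
        A = trans_stats / trans_stats.sum(axis=1, keepdims=True)
        mu += g * p_new * (yt - mu)                # Robbins-Monro step on levels
        var = (1 - g) * var + g * float(p_new @ (yt - mu) ** 2)
        p = p_new
    return A, mu, var
```

Because only the current posterior and a fixed set of running statistics are stored, memory does not grow with the data length, and the decaying gain can be replaced by a small constant step size to track slowly varying or jump-changing parameters, as the abstract describes.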
