Variational Learning in Mixed-State Dynamic Graphical Models

Many real-valued stochastic time series are locally linear (Gaussian) but globally nonlinear. For example, the trajectory of a human hand gesture can be viewed as a linear dynamic system driven by a nonlinear dynamic system that represents muscle actions. We present a mixed-state dynamic graphical model in which a hidden Markov model drives a linear dynamic system. This combination allows us to model both the discrete and continuous causes of trajectories such as human gestures. The number of computations needed for exact inference is exponential in the sequence length, so we derive an approximate variational inference technique that can also be used to learn the parameters of the discrete and continuous models. We show how the mixed-state model and the variational technique can be used to classify human hand gestures made with a computer mouse.
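The generative structure described above can be sketched as follows: a discrete HMM state selects, at each time step, which set of linear-Gaussian dynamics drives the continuous state. All parameter values below (two regimes, two-dimensional state, the specific transition and noise matrices) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a 2-regime mixed-state model: the discrete
# HMM state s_t chooses which dynamics matrix drives the continuous x_t.
pi = np.array([0.5, 0.5])                  # initial discrete-state distribution
P = np.array([[0.95, 0.05],                # HMM transitions: P[i, j] = p(s_t=j | s_{t-1}=i)
              [0.05, 0.95]])
A = [np.array([[1.0,  0.1], [0.0, 1.0]]),  # continuous dynamics for regime 0
     np.array([[1.0, -0.1], [0.0, 1.0]])]  # continuous dynamics for regime 1
Q = 0.01 * np.eye(2)                       # process-noise covariance (shared here)
C = np.eye(2)                              # observation matrix
R = 0.05 * np.eye(2)                       # observation-noise covariance

def sample(T):
    """Draw one length-T trajectory from the mixed-state model."""
    s = rng.choice(2, p=pi)                # initial discrete state
    x = np.zeros(2)                        # initial continuous state
    ss, xs, ys = [], [], []
    for _ in range(T):
        s = rng.choice(2, p=P[s])          # discrete dynamics (HMM step)
        x = A[s] @ x + rng.multivariate_normal(np.zeros(2), Q)  # switched LDS step
        y = C @ x + rng.multivariate_normal(np.zeros(2), R)     # noisy observation
        ss.append(s); xs.append(x); ys.append(y)
    return np.array(ss), np.array(xs), np.array(ys)

s, x, y = sample(100)
print(s.shape, x.shape, y.shape)
```

The exponential cost of exact inference is visible in this sketch: with K regimes, the exact posterior over the continuous state at time T is a Gaussian mixture with K^T components, one per discrete-state path, which is what the variational approximation avoids tracking.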
