A New Algorithm for Learning Non-Stationary Dynamic Bayesian Networks With Application to Event Detection

Dynamic Bayesian networks (DBNs) are a popular framework for managing uncertainty in time-evolving systems, and their efficient learning has therefore received much attention in the literature. Most existing approaches, however, assume that the data to be modeled by a DBN are generated by a stationary process, i.e., that neither the structure nor the parameters of the network evolve over time. Unfortunately, there exist real-world problems where such a hypothesis is highly unrealistic, e.g., video event recognition, social networks, or road traffic analysis. In this paper, we propose a principled approach to learning the structure and parameters of "non-stationary DBNs" that can cope with such situations. Our algorithm is specifically designed for settings in which all input data arrive as a stream. Unlike previous work on non-stationary DBN learning, we make no restrictive assumption about the way the structure evolves or about parameter independence during this evolution. Yet, as highlighted in our experiments, the algorithm scales very well, and its lack of restrictive assumptions makes it very effective at detecting events (evolutions).
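To make the notion of non-stationarity concrete, the following is a minimal, hypothetical Python sketch (not the paper's algorithm) of a DBN whose transition structure changes at an unknown time: the parent of one variable switches mid-stream, and a simple windowed, likelihood-based scorer notices the switch. The simulator, the window size, and helper names (simulate, window_loglik, best_parent) are all illustrative assumptions.

# Illustrative only: a DBN over three binary variables where the parent of X2
# switches from X0 to X1 at an unknown change point; a streaming learner should
# flag this structural change as an "event".
import numpy as np

rng = np.random.default_rng(0)

def simulate(T, change_point):
    """Binary 3-variable DBN; the parent of X2 switches from X0 to X1 at change_point."""
    X = np.zeros((T, 3), dtype=int)
    X[0] = rng.integers(0, 2, size=3)
    for t in range(1, T):
        X[t, 0] = rng.integers(0, 2)                      # X0: exogenous noise
        X[t, 1] = X[t - 1, 0] ^ (rng.random() < 0.1)      # X1 <- X0 (noisy copy)
        parent = X[t - 1, 0] if t < change_point else X[t - 1, 1]
        X[t, 2] = parent ^ (rng.random() < 0.1)           # X2's parent changes: the "event"
    return X

def window_loglik(X, child, parent):
    """Log-likelihood of child given one candidate parent, with Laplace smoothing."""
    counts = np.ones((2, 2))                              # counts[parent_value, child_value]
    for t in range(1, len(X)):
        counts[X[t - 1, parent], X[t, child]] += 1
    probs = counts / counts.sum(axis=1, keepdims=True)
    return sum(np.log(probs[X[t - 1, parent], X[t, child]]) for t in range(1, len(X)))

def best_parent(X, child, candidates):
    """Pick the candidate parent with the highest windowed log-likelihood score."""
    return max(candidates, key=lambda p: window_loglik(X, child, p))

# Stream the data in fixed-size windows; report when the best-scoring parent of X2 changes.
data = simulate(T=2000, change_point=1000)
window, previous = 200, None
for start in range(0, len(data) - window + 1, window):
    chunk = data[start:start + window]
    current = best_parent(chunk, child=2, candidates=[0, 1])
    if previous is not None and current != previous:
        print(f"structure change (event) detected near t={start}: parent of X2 is now X{current}")
    previous = current

Run as-is, this prints a single detection near t=1000, i.e., the window in which the generating structure actually changed; the paper's contribution is, of course, a far more general streaming learner that makes no such windowing or single-parent assumptions.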
