Diagnosis using a first-order stochastic language that learns

We have developed a diagnostic/prognostic software tool for the analysis of complex systems, such as monitoring the "running health" of helicopter rotor systems. Although our software is not yet deployed for real-time in-flight diagnosis, we have successfully analyzed data sets from actual helicopter rotor failures supplied by the US Navy. In this paper we discuss both the critical techniques supporting the design of our stochastic diagnostic system and the issues related to its full deployment, and we present four examples of its use. Our diagnostic system, called DBAYES, is composed of a logic-based, first-order, Turing-complete set of software tools for stochastic modeling, which we use to model time-series data supplied by sensors on mechanical systems. The inference scheme for these tools is based on a variant of Pearl's loopy belief propagation algorithm [Pearl, J. (1988). Probabilistic reasoning in intelligent systems: Networks of plausible inference. San Francisco, CA: Morgan Kaufmann]. Our language contains variables that can capture general classes of situations, events, and relationships; because it is Turing complete, it can reason about potentially infinite classes of situations, much as dynamic Bayesian networks do. Because the inference algorithm is a variant of loopy belief propagation, the language also supports expectation-maximization-style learning of parameters in the modeled domain. We briefly present the theoretical foundations of our first-order stochastic language and then demonstrate time-series modeling and learning in the context of fault diagnosis.
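To illustrate the inference machinery named above, the following is a minimal sketch of loopy belief propagation (sum-product message passing) on a small pairwise Markov network containing a cycle, the setting in which "loopy" BP is approximate rather than exact. This is not the authors' DBAYES implementation; the 3-node cycle, potentials, and function names are invented for the example.

```python
# Hedged sketch of loopy belief propagation on a pairwise Markov network.
# The graph, potentials, and API below are illustrative assumptions, not
# the DBAYES system described in the paper.

def loopy_bp(unary, pairwise, edges, n_iter=50):
    """unary: {node: [phi(0), phi(1)]}; pairwise: {(i, j): 2x2 table};
    edges: undirected (i, j) pairs. Returns approximate marginals."""
    # One message per edge direction, initialised uniform.
    msgs = {}
    for i, j in edges:
        msgs[(i, j)] = [0.5, 0.5]
        msgs[(j, i)] = [0.5, 0.5]
    neighbors = {n: [] for n in unary}
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)

    def psi(i, j, xi, xj):
        # Pairwise potential, stored once per undirected edge.
        if (i, j) in pairwise:
            return pairwise[(i, j)][xi][xj]
        return pairwise[(j, i)][xj][xi]

    for _ in range(n_iter):
        new = {}
        for (i, j) in msgs:
            # m_{i->j}(x_j) = sum_{x_i} phi_i(x_i) psi_{ij}(x_i, x_j)
            #                 * prod_{k in N(i) \ {j}} m_{k->i}(x_i)
            out = []
            for xj in (0, 1):
                s = 0.0
                for xi in (0, 1):
                    prod = unary[i][xi] * psi(i, j, xi, xj)
                    for k in neighbors[i]:
                        if k != j:
                            prod *= msgs[(k, i)][xi]
                    s += prod
                out.append(s)
            z = sum(out)
            new[(i, j)] = [v / z for v in out]
        msgs = new  # synchronous update

    beliefs = {}
    for n in unary:
        b = [unary[n][x] for x in (0, 1)]
        for k in neighbors[n]:
            b = [b[x] * msgs[(k, n)][x] for x in (0, 1)]
        z = sum(b)
        beliefs[n] = [v / z for v in b]
    return beliefs

# A 3-node cycle with attractive couplings; node 0 is biased toward state 0.
unary = {0: [0.9, 0.1], 1: [0.5, 0.5], 2: [0.5, 0.5]}
coupling = [[2.0, 1.0], [1.0, 2.0]]  # favours agreement between neighbours
edges = [(0, 1), (1, 2), (0, 2)]
pairwise = {e: coupling for e in edges}
marginals = loopy_bp(unary, pairwise, edges)
```

With attractive couplings, the bias at node 0 propagates around the cycle, so all three approximate marginals end up favouring state 0. The EM-style parameter learning mentioned in the abstract would sit on top of such marginals, using them as expected sufficient statistics in the E-step.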

[1] B.-H. Juang et al. On the hidden Markov model and dynamic time warping for speech recognition — A unified view. AT&T Bell Laboratories Technical Journal, 1984.

[2] Judea Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann Series in Representation and Reasoning, 1991.

[3] James Cussens et al. Prolog Issues of an MCMC Algorithm. INAP, 2001.

[4] Geoffrey E. Hinton et al. Recognizing Hand-written Digits Using Hierarchical Products of Experts. NIPS, 2002.

[5] Stephen Muggleton. Bayesian Inductive Logic Programming. ICML, 1994.

[6] V. S. Subrahmanian et al. Probabilistic Logic Programming. Information and Computation, 1992.

[7] D. Rubin et al. Maximum likelihood from incomplete data via the EM algorithm (with discussion). 1977.

[8] Luc De Raedt et al. Bayesian Logic Programs. ILP Work-in-Progress Reports, 2001.

[9] Peter Haddawy et al. Answering Queries from Context-Sensitive Probabilistic Knowledge Bases. Theoretical Computer Science, 1997.

[10] Daphne Koller et al. Probabilistic Abstraction Hierarchies. NIPS, 2001.

[11] George F. Luger et al. Toward General Analysis of Recursive Probability Models. UAI, 2001.

[12] Eric Horvitz et al. Dynamic Network Models for Forecasting. UAI, 1992.

[13] James Cussens et al. Parameter Estimation in Stochastic Logic Programs. Machine Learning, 2001.

[14] Daniel Pless et al. Learning of Product Distributions in a First-Order Stochastic Logic Language.

[15] Lise Getoor et al. Learning Probabilistic Relational Models. IJCAI, 1999.

[16] James Cussens et al. Markov Chain Monte Carlo using Tree-Based Priors on Model Structure. UAI, 2001.

[17] Avi Pfeffer et al. Probabilistic Frame-Based Systems. AAAI/IAAI, 1998.