Inference in Hybrid Bayesian Networks Using Mixtures of Gaussians

The main goal of this paper is to describe a method for exact inference in general hybrid Bayesian networks (BNs), i.e., networks containing a mix of discrete and continuous chance variables. Our method approximates a general hybrid BN by a mixture-of-Gaussians (MoG) BN. The Lauritzen-Jensen (LJ) algorithm performs fast exact inference in MoG BNs, and a commercial implementation of it exists; however, it applies only to MoG BNs, which are restricted in two ways: all continuous chance variables must have conditional linear Gaussian distributions, and discrete chance nodes cannot have continuous parents. The method described in this paper enables the LJ algorithm to be used for a broader class of hybrid BNs, including networks with continuous chance nodes having non-Gaussian distributions, networks with no restrictions on the topology of discrete and continuous variables, networks with conditionally deterministic variables that are nonlinear functions of their continuous parents, and networks with continuous chance variables whose variances are functions of their parents.
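The core preprocessing step the abstract describes, replacing a non-Gaussian conditional density by a mixture of Gaussians, can be sketched with a small EM fit. This is an illustrative sketch only, not the paper's algorithm: the function name `fit_gaussian_mixture` and all parameter choices (2 components, an exponential target density) are assumptions for the example.

```python
import numpy as np

def fit_gaussian_mixture(x, k=2, iters=200, seed=0):
    """Fit a k-component 1-D Gaussian mixture to samples x via EM."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k)           # initialize means from the data
    var = np.full(k, np.var(x))          # start all variances at the sample variance
    w = np.full(k, 1.0 / k)              # uniform initial mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        n_k = r.sum(axis=0)
        w = n_k / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k
    return w, mu, var

# Approximate a non-Gaussian (here: exponential) density by a 2-component MoG,
# the kind of substitution that makes a node usable in a MoG BN.
samples = np.random.default_rng(1).exponential(scale=1.0, size=5000)
w, mu, var = fit_gaussian_mixture(samples, k=2)
```

Once every non-Gaussian conditional has been replaced by such a mixture (with the mixture indicator modeled as an extra discrete node), the resulting network is a MoG BN and amenable to the LJ algorithm. Note that EM exactly preserves the sample mean: the fitted mixture mean `np.dot(w, mu)` equals `samples.mean()`.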
