Advanced Inference in Bayesian Networks

The previous chapter introduced inference in discrete-variable Bayesian networks, using evidence propagation on a junction tree to compute marginal distributions of interest. This chapter presents a tutorial introduction to several further calculations that can also be performed with the junction tree, specifically: sampling, most likely configurations, fast retraction, and Gaussian and conditional Gaussian models.
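As a small illustration of the first of these topics, the sketch below shows forward (ancestral) sampling in a toy three-node network: each variable is sampled given its already-sampled parents, in topological order. The network, its conditional probability tables, and the function names are invented for illustration; they are not taken from the chapter, and the chapter's own sampling scheme operates on the junction tree rather than directly on the network.

```python
import random

# Hypothetical chain network Cloudy -> Rain -> WetGrass (illustrative only).
# Each table gives P(child = True | parent value).
P_CLOUDY = 0.5
P_RAIN = {True: 0.8, False: 0.2}   # P(Rain | Cloudy)
P_WET = {True: 0.9, False: 0.1}    # P(WetGrass | Rain)

def forward_sample():
    """Draw one joint sample by sampling each node given its parents,
    following a topological order of the network (ancestral sampling)."""
    cloudy = random.random() < P_CLOUDY
    rain = random.random() < P_RAIN[cloudy]
    wet = random.random() < P_WET[rain]
    return cloudy, rain, wet

def estimate_p_rain(n=100_000, seed=0):
    """Monte Carlo estimate of the marginal P(Rain = True)
    from n independent forward samples."""
    random.seed(seed)
    return sum(forward_sample()[1] for _ in range(n)) / n
```

For this toy model the exact marginal is P(Rain) = 0.5 · 0.8 + 0.5 · 0.2 = 0.5, so the Monte Carlo estimate should land close to 0.5 for large n.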
