An Empirical Study of w-Cutset Sampling for Bayesian Networks

This paper studies empirically the time-space trade-off between sampling and inference in the cutset sampling algorithm. The algorithm samples over a subset of nodes in a Bayesian network and applies exact inference over the rest. As the size of the sampling space decreases, fewer samples are needed for convergence, but the time for generating each individual sample increases. The w-cutset sampling algorithm selects a sampling set such that the induced width of the network, conditioned on the sampling set, is bounded by w; the exact-inference step is therefore at most exponential in w. In this paper, we investigate the performance of w-cutset sampling as a function of w. Our experiments over a range of randomly generated and real benchmarks demonstrate the power of the cutset sampling idea and, in particular, show that an optimal balance between inference and sampling benefits substantially from restricting the cutset size, even at the cost of more complex inference.
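The idea above can be sketched in a minimal form. The following Python example (a hypothetical toy network, not one of the paper's benchmarks) uses a binary "diamond" network A → B, A → C, B → D, C → D, takes {A} as the cutset, samples A from its exact posterior, and sums out the remaining variables exactly for each cutset value; node marginals are then estimated by averaging the exact conditionals, in the Rao-Blackwellised spirit the paper builds on. All CPT values and variable names here are illustrative assumptions.

```python
import random

# Hypothetical toy network (illustrative CPTs, not from the paper):
# A -> B, A -> C, B -> D, C -> D.  Conditioning on the cutset {A}
# breaks the loop, so the remaining variables can be summed out exactly.
P_A = {0: 0.6, 1: 0.4}                                   # P(A)
P_B = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}         # P(B | A)
P_C = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}         # P(C | A)
P_D = {(0, 0): {0: 0.9, 1: 0.1}, (0, 1): {0: 0.4, 1: 0.6},
       (1, 0): {0: 0.3, 1: 0.7}, (1, 1): {0: 0.1, 1: 0.9}}  # P(D | B, C)

def joint(a, b, c, d):
    """Full joint probability P(a, b, c, d) of the toy network."""
    return P_A[a] * P_B[a][b] * P_C[a][c] * P_D[(b, c)][d]

def cutset_sample(evidence_d=1, n_samples=5000, seed=0):
    """Estimate P(B=1 | D=evidence_d): sample the cutset {A} from its
    exact posterior, then average the exact conditional P(B=1 | a, D)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # Exact inference given each cutset value: sum out B and C.
        w = {a: sum(joint(a, b, c, evidence_d)
                    for b in (0, 1) for c in (0, 1)) for a in (0, 1)}
        a = 0 if rng.random() < w[0] / (w[0] + w[1]) else 1
        # Rao-Blackwellised increment: exact P(B=1 | a, D=evidence_d).
        total += sum(joint(a, 1, c, evidence_d) for c in (0, 1)) / w[a]
    return total / n_samples
```

With a single cutset variable the sampler draws i.i.d. from the exact posterior; in the general w-cutset setting the per-sample summation would instead be a bounded-width exact inference (e.g., bucket elimination) whose cost is exponential in w.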
