Learning Conditional Probabilities from Incomplete Data: An Experimental Comparison

Marco Ramoni, Knowledge Media Institute, The Open University
Paola Sebastiani, Statistics Department, The Open University

Abstract: This paper compares three methods, the EM algorithm, Gibbs sampling, and Bound and Collapse (BC), to estimate conditional probabilities from incomplete databases in a controlled experiment. Results show a substantial equivalence of the estimates provided by the three methods and a dramatic gain in efficiency using BC.

Reprinted from: Proceedings of Uncertainty 99: Seventh International Workshop on Artificial Intelligence and Statistics, Morgan Kaufmann, San Mateo, CA, 1999.

Address: Marco Ramoni, Knowledge Media Institute, The Open University, Milton Keynes, United Kingdom MK7 6AA. Phone: +44 (1908) 655721, fax: +44 (1908) 653169, email: m.ramoni@open.ac.uk, url: http://kmi.open.ac.uk/people/marco.
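To make the estimation task concrete, the following is a minimal sketch of the EM approach for one conditional probability table. It is illustrative only, not the implementation compared in the paper: the model, variable names, and the two-node structure (binary X with possibly missing values, observed binary Y depending on X) are assumptions chosen for brevity. The E-step fills in each missing X with its posterior probability given Y; the M-step re-estimates the parameters from the resulting expected counts.

```python
# Minimal EM sketch for an assumed two-node model X -> Y, both binary.
# Estimates theta_x = P(X=1) and theta_y[x] = P(Y=1 | X=x) from records
# in which X may be missing (None) but Y is always observed.
# Hypothetical example, not the paper's experimental setup.

def em(records, iters=50):
    """records: list of (x, y) pairs, x in {0, 1, None}, y in {0, 1}."""
    theta_x, theta_y = 0.5, [0.5, 0.5]
    for _ in range(iters):
        # E-step: accumulate expected sufficient statistics.
        n_x = [0.0, 0.0]      # expected count of records with X = x
        n_y1 = [0.0, 0.0]     # expected count of records with X = x and Y = 1
        for x, y in records:
            if x is None:
                # Posterior P(X=1 | Y=y) by Bayes' rule under current parameters.
                p1 = theta_x * (theta_y[1] if y else 1 - theta_y[1])
                p0 = (1 - theta_x) * (theta_y[0] if y else 1 - theta_y[0])
                w = p1 / (p1 + p0)
            else:
                w = float(x)  # fully observed record: hard assignment
            n_x[0] += 1 - w
            n_x[1] += w
            if y:
                n_y1[0] += 1 - w
                n_y1[1] += w
        # M-step: maximize the expected complete-data likelihood.
        theta_x = n_x[1] / len(records)
        theta_y = [n_y1[x] / n_x[x] for x in (0, 1)]
    return theta_x, theta_y
```

With no missing values the procedure reduces to the empirical frequencies in one pass; with missing values each iteration is guaranteed not to decrease the observed-data likelihood, which is the property EM shares with the Gibbs-sampling and BC alternatives the paper evaluates only in the limit of their respective estimates.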