A Bayesian belief network is a model of a joint distribution over a finite set of variables, with a DAG structure representing immediate dependencies among the variables. For each node, a table of parameters (CP-table) represents local conditional probabilities, with rows indexed by conditioning events (assignments to the node's parents). CP-table rows are usually modeled as independent random vectors, each assigned a Dirichlet prior distribution. The assumption that rows are independent permits a relatively simple analysis but may not reflect actual prior opinion about the parameters: rows representing similar conditioning events often have similar conditional probabilities. This paper introduces a more flexible family of "dependent Dirichlet" prior distributions, in which rows are not necessarily independent. Simple methods are developed to approximate the Bayes estimators of CP-table parameters with optimal linear estimators, i.e., linear combinations of sample proportions and prior means. This approach yields more efficient estimators by sharing information among rows. The improvement in efficiency can be substantial when a CP-table has many rows and sample sizes are small.
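For orientation, the linear form referred to above can be illustrated with the familiar independent-Dirichlet case, where the posterior mean of each CP-table row is exactly a weighted average of the sample proportion and the prior mean. The notation below (counts \(n_{ij}\) and Dirichlet parameters \(\alpha_{ij}\) for cell \(j\) of row \(i\)) is introduced only for this sketch and is not taken from the paper:
\[
\hat{\theta}_{ij}
  \;=\; \frac{n_{ij} + \alpha_{ij}}{n_{i\cdot} + \alpha_{i\cdot}}
  \;=\; \lambda_i\,\frac{n_{ij}}{n_{i\cdot}} \;+\; (1 - \lambda_i)\,\frac{\alpha_{ij}}{\alpha_{i\cdot}},
\qquad
\lambda_i \;=\; \frac{n_{i\cdot}}{n_{i\cdot} + \alpha_{i\cdot}},
\]
where \(n_{i\cdot} = \sum_j n_{ij}\) and \(\alpha_{i\cdot} = \sum_j \alpha_{ij}\). The dependent-Dirichlet approach described in the abstract retains this linear structure (a combination of sample proportions and prior means) while sharing information among rows, which is the stated source of the efficiency gains.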