Abstract. When two or more information sources ("experts") provide a decision maker with information on two or more random variables, the decision maker using Bayes's rule has an opportunity to (a) update a prior about the random variables and (b) calibrate the experts. (Calibration is the process of adjusting the decision maker's likelihood about the experts' assessments.) This article presents a model for this two-way process and specializes to the case in which the experts' assessment errors have a multivariate normal density. In general, we find that variables that the decision maker and the experts regard as independent a priori will be dependent a posteriori because of dependence in the assessment errors. Formulas for posterior densities are given for the normal model. In this model the posterior density of the random variables depends only on a weighted average of the experts' means, with weights that depend on the experts' assessments of previously known quantities. We also present a special case o...
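As a hedged illustration of the kind of combination rule the abstract describes (not the paper's exact formulas, which also involve calibration weights estimated from previously known quantities): under a normal model with a diffuse prior and known, independent error variances, pooling expert point assessments reduces to a precision-weighted average of their reported means.

```python
import numpy as np

def combine_experts(means, variances):
    """Precision-weighted combination of expert assessments of one quantity.

    A sketch under simplifying assumptions: a diffuse prior, and each expert's
    assessment error is independent normal with the given variance. Returns
    the posterior mean and variance.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    post_var = 1.0 / precisions.sum()                     # combined precision
    post_mean = post_var * (precisions * means).sum()     # weighted average
    return post_mean, post_var

# Two experts: the more precise one (variance 1) pulls the posterior
# toward its own mean.
mean, var = combine_experts([10.0, 14.0], [1.0, 4.0])
# mean -> 10.8, var -> 0.8
```

Dependence among the experts' errors, which the article emphasizes, would replace the diagonal precisions above with the inverse of a full error covariance matrix.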