Bayesian computational methods

The Bayesian (or integrated likelihood) approach to statistical modelling and analysis proceeds by representing all uncertainties in the form of probability distributions. Learning from new data is accomplished by application of Bayes' theorem, which provides a joint probability description of uncertainty for all model unknowns. Passing from this joint probability distribution to a collection of marginal summary inferences for specified individual unknowns (or subsets of unknowns) requires appropriate integration of the joint distribution. In all but simple stylized problems, these (typically high-dimensional) integrations will have to be performed numerically. This need for efficient simultaneous calculation of potentially many numerical integrals poses novel computational problems. Developments over the past decade are reviewed, including adaptive quadrature, adaptive Monte Carlo, and a variant of a Markov chain simulation procedure known as the Gibbs sampler.
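To fix ideas, a minimal sketch in assumed notation (the symbols below are illustrative, not taken from the text): write $y$ for the observed data and $\theta = (\theta_1, \ldots, \theta_k)$ for the model unknowns, with prior $p(\theta)$ and likelihood $p(y \mid \theta)$. Bayes' theorem gives the joint posterior

\[
p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{\int p(y \mid \theta)\, p(\theta)\, d\theta},
\]

and a marginal summary for a single component $\theta_i$ requires the further $(k-1)$-dimensional integration

\[
p(\theta_i \mid y) \;=\; \int p(\theta \mid y)\, d\theta_{-i},
\]

where $\theta_{-i}$ denotes the components of $\theta$ other than $\theta_i$. It is integrals of this kind, needed simultaneously for many components or functions of $\theta$, that the quadrature, Monte Carlo and Gibbs sampling methods referred to above are designed to approximate; the Gibbs sampler, for instance, avoids direct integration by drawing successively from the full conditional distributions $p(\theta_i \mid \theta_{-i}, y)$.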