Mixture Models, Monte Carlo, Bayesian Updating and Dynamic Models

This paper reviews the development of discrete mixture distributions as approximations to priors and posteriors in Bayesian analysis, focusing on the development of simulation-based techniques in sequential updating and the analysis of dynamic models. Adaptive density estimation techniques enable the construction of mixtures of elliptical distributions, useful both as direct approximations and as importance sampling functions. Illustrations in sequential modelling are discussed.

Adaptive mixture modelling

Importance sampling and mixtures

With the recent mushrooming application of simulation-based methods of numerical analysis, Bayesian analysis often involves the approximation of continuous prior and posterior distributions using discrete sets of points and associated weights, based on a Monte Carlo approximation. West (1992a) introduced an adaptive importance sampling scheme to develop such discrete approximations, together with methods to provide smooth reconstructions, in cases when the prior and likelihood functions separately, or the posterior directly, may be at least approximately evaluated up to irrelevant constants.

Suppose $p(\theta)$ is the continuous posterior density function for a continuous parameter vector $\theta$. An approximating density $g(\theta)$ is used as an importance sampling function (Geweke 1989), as follows. Let $\{\theta_j,\ j = 1, \ldots, n\}$ be a random sample from $g$, and define weights $\{w_j,\ j = 1, \ldots, n\}$ by $w_j = p(\theta_j)/\{k\, g(\theta_j)\}$ for each $j$, where $k = \sum_{j=1}^{n} p(\theta_j)/g(\theta_j)$. In practice the weights are evaluated via $w_j \propto p(\theta_j)/g(\theta_j)$ and then normalised to unit sum. Inference under $p$ is then approximated using the discrete distribution having masses $w_j$ at the points $\theta_j$, for each $j = 1, \ldots, n$.

The conditions on $g$ required to achieve reasonable approximations typically reduce to requiring that $g$ have the same support as $p$ and that the tails of $g$ be heavier than those of $p$, so that variations on multivariate T distributions have become popular (Geweke 1989). In West (1992a), mixtures of T distributions are proposed, one additional reason being that mixtures have the flexibility to represent the possibly quite complex and varied forms of
posterior densities. This is done using kernel density estimation techniques.

With an importance sampling function $g$ close to the true density $p$, kernel density estimation (or other smoothing techniques) provides continuous estimates of joint and marginal densities. In simple univariate random sampling problems, kernel-type density estimates have a direct Bayesian interpretation as approximate predictive distributions in models based on mixtures of Dirichlet processes (West); multivariate analogues are similarly derivable (Erkanli, Müller and West). West (1992a) uses weighted variations on multivariate kernel estimates as importance sampling functions and, with some modification, to more directly estimate marginal densities of $p$. The basic idea is as follows. Given a chosen importance sampling density $g$, the sample of size $n$ and the associated weights, the exact density $p$ may be approximated by a weighted kernel estimate of the form
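As a concrete illustration of the scheme above, the following minimal sketch computes the normalised importance weights $w_j \propto p(\theta_j)/g(\theta_j)$ and a weighted kernel (normal-mixture) reconstruction of $p$. The target density, the Student-t importance sampler, and the fixed bandwidth are all invented for the example; they are not taken from the paper, which develops adaptive bandwidth and mixture choices.

```python
import numpy as np
from scipy import stats

# Hypothetical target posterior p: a simple normal, N(2, 1).
p = stats.norm(loc=2.0, scale=1.0)
# Importance sampler g: Student-t, with heavier tails than p as the text requires.
g = stats.t(df=5)

rng = np.random.default_rng(1)
theta = g.rvs(size=5000, random_state=rng)

# Weights w_j proportional to p(theta_j)/g(theta_j), computed on the log scale
# for numerical stability, then normalised to unit sum.
log_w = p.logpdf(theta) - g.logpdf(theta)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Inference under p via the discrete approximation {theta_j, w_j}:
post_mean = np.sum(w * theta)

# Smooth reconstruction: a weighted kernel estimate of p, i.e. a mixture of
# normal kernels centred at the sampled points, with an illustrative bandwidth h.
h = 0.3
def p_hat(x):
    return np.sum(w * stats.norm.pdf(x, loc=theta, scale=h))
```

The weighted kernel estimate `p_hat` is itself a finite normal mixture, which is what makes it reusable as an importance sampling function in the adaptive scheme.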