Editorial: Special issue on adaptive Monte Carlo methods

Monte Carlo methods have proved indispensable for many modern statistical applications, particularly within Bayesian statistics. When analysing complex models, the intractability of likelihoods and posterior distributions requires the use of numerical approximations, and, particularly in high dimensions, Monte Carlo approximation is often the approach of choice. The most commonly used Monte Carlo methods are Markov chain Monte Carlo (MCMC) and reversible jump MCMC (Green 1995), but important alternatives include sequential Monte Carlo (Doucet et al. 2000), population Monte Carlo (Cappé et al. 2004) and importance sampling (see Fearnhead 2008 for more details of alternatives to MCMC). In all cases, the efficiency of a given Monte Carlo method depends on how it is implemented. In many cases, theoretical aspects of the methods are well understood and can be used to guide implementation; for example, much is known about optimal acceptance rates for various MCMC algorithms (Roberts and Rosenthal 2001). However, using such information to tune an algorithm by hand can be very time-consuming.

This special issue focuses on a class of Monte Carlo methods that "tune themselves". The algorithms adapt as they are run, and hence are called adaptive Monte Carlo methods. The papers in this issue review such methods, suggest new adaptive Monte Carlo algorithms, and include case studies demonstrating their efficiency. One common theme is the simplicity of implementing many adaptive Monte Carlo methods, and one hope of this special issue is that it will help encourage their use.
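As a flavour of how simple such methods can be, the sketch below gives a minimal adaptive random-walk Metropolis sampler in Python. It is an illustration only, not an algorithm taken from any paper in this issue: the function name, the target acceptance rate of 0.234 (motivated by the scaling results of Roberts and Rosenthal 2001) and the decay exponent of the adaptation step size are illustrative choices. The proposal scale is adjusted after every iteration, with the amount of adaptation decaying over time, in the spirit of the diminishing-adaptation schemes reviewed in the tutorial by Andrieu et al. (2008) in this issue.

```python
import numpy as np


def adaptive_rwm(log_target, x0, n_iter=10_000, target_accept=0.234, seed=0):
    """Random-walk Metropolis with an adaptively tuned proposal scale.

    The log proposal scale is nudged after every iteration so that the
    acceptance rate drifts towards `target_accept`; the adaptation step
    size decays with the iteration number (diminishing adaptation), so
    the amount of adaptation vanishes asymptotically.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    log_p = log_target(x)
    log_scale = 0.0  # proposal standard deviation is exp(log_scale)
    samples = np.empty((n_iter, x.size))

    for i in range(n_iter):
        proposal = x + np.exp(log_scale) * rng.standard_normal(x.size)
        log_p_prop = log_target(proposal)
        accept_prob = np.exp(min(0.0, log_p_prop - log_p))
        if rng.random() < accept_prob:
            x, log_p = proposal, log_p_prop
        # Robbins-Monro-style update with decaying step size (i+1)^(-0.6)
        log_scale += (i + 1) ** -0.6 * (accept_prob - target_accept)
        samples[i] = x

    return samples


# Toy example: sample a standard bivariate normal and check the moments.
if __name__ == "__main__":
    draws = adaptive_rwm(lambda x: -0.5 * np.sum(x**2), x0=np.zeros(2))
    print(draws[5000:].mean(axis=0), draws[5000:].std(axis=0))
```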

[1] O. Cappé, et al. Population Monte Carlo. J. Comput. Graph. Stat., 2004.

[2] G. Y. Sofronov, et al. Adaptive independence samplers. Stat. Comput., 2008.

[3] P. Fearnhead. Computational methods for complex stochastic systems: a review of some alternatives to MCMC. Stat. Comput., 2008.

[4] F. Liang, et al. Adaptive evolutionary Monte Carlo algorithm for optimization with applications to sensor placement problems. Stat. Comput., 2008.

[5] C. J. F. ter Braak, et al. Differential Evolution Markov Chain with snooker updater and fewer chains. Stat. Comput., 2008.

[6] J.-M. Marin, et al. Adaptive importance sampling in general mixture classes. Stat. Comput., 2007.

[7] R. Meyer, et al. Metropolis–Hastings algorithms with adaptive proposals. Stat. Comput., 2008.

[8] E. Moulines, et al. Adaptive methods for sequential importance sampling with application to state space models. 16th European Signal Processing Conference, 2008.

[9] C. Andrieu, et al. A tutorial on adaptive MCMC. Stat. Comput., 2008.

[10] G. O. Roberts and J. S. Rosenthal. Optimal scaling for various Metropolis-Hastings algorithms. Statistical Science, 2001.

[11] C. W. S. Chen, et al. Bayesian inference and model comparison for asymmetric smooth transition heteroskedastic models. Stat. Comput., 2008.

[12] P. Green. Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 1995.

[13] A. Doucet, S. J. Godsill and C. Andrieu. On sequential Monte Carlo sampling methods for Bayesian filtering. Stat. Comput., 2000.