On the use of local optimizations within Metropolis–Hastings updates

Summary.  We propose new Metropolis–Hastings algorithms for sampling from multimodal distributions on ℝⁿ. Tjelmeland and Hegstad have obtained direct mode jumping proposals by performing optimization within Metropolis–Hastings updates, with different proposals for ‘forward’ and ‘backward’ steps. We generalize their scheme by allowing the probability distribution over forward and backward kernels to depend on the current state. We use the new setting to combine mode jumping proposals with proposals from a prior approximation. As a result, the frequency of proposals from the different proposal kernels is automatically adjusted according to their quality. Mode jumping proposals include local optimizations. When combining these with a prior approximation, it is tempting to use the local optimization results not only for mode jumping proposals but also to improve the prior approximation. We show how this idea can be implemented. The resulting algorithm is adaptive but retains a Markov structure. We evaluate the effectiveness of the proposed algorithms in two simulation examples.
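A mode jumping update of the kind the summary describes can be sketched as follows: from the current state, make a large deterministic jump, run a local optimization from the jumped-to point, and propose from a Gaussian centred at the mode found; the reverse jump and a second optimization define the backward proposal density entering the Metropolis–Hastings acceptance probability. The sketch below is a minimal one-dimensional illustration, not the authors' algorithm: the bimodal toy target, the jump length `delta`, the proposal scale `sigma` and the gradient-ascent settings are all illustrative assumptions.

```python
import math
import random

def log_target(x):
    # Assumed bimodal toy target: equal mixture of N(-5, 1) and N(5, 1).
    a = math.exp(-0.5 * (x + 5.0) ** 2)
    b = math.exp(-0.5 * (x - 5.0) ** 2)
    return math.log(0.5 * a + 0.5 * b) - 0.5 * math.log(2.0 * math.pi)

def local_optimize(z, n_steps=100, step=0.1, h=1e-5):
    # Simple gradient ascent on log_target from the jumped-to point z,
    # standing in for a generic local optimizer.
    for _ in range(n_steps):
        grad = (log_target(z + h) - log_target(z - h)) / (2.0 * h)
        z += step * grad
    return z

def log_normal_pdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def mode_jump_step(x, delta=10.0, sigma=0.7):
    # Forward: jump +/- delta, optimize locally, propose around the mode found.
    s = random.choice((-1.0, 1.0))
    mu_f = local_optimize(x + s * delta)
    y = random.gauss(mu_f, sigma)
    # Backward: reverse jump from y, optimize, and evaluate the backward
    # proposal density at the current state x.
    mu_b = local_optimize(y - s * delta)
    log_alpha = (log_target(y) + log_normal_pdf(x, mu_b, sigma)
                 - log_target(x) - log_normal_pdf(y, mu_f, sigma))
    return y if math.log(random.random()) < log_alpha else x

random.seed(0)
x, samples = -5.0, []
for _ in range(500):
    x = mode_jump_step(x)
    samples.append(x)
print(min(samples), max(samples))
```

The key point of the construction is that the forward and backward optimizations make the large jump reversible, so the acceptance probability can be computed even though the proposal depends on an optimization run. In practice such mode jumping updates would be mixed with ordinary local updates, as in the combined kernels discussed in the summary.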

[1] N. Metropolis et al., Equation of State Calculations by Fast Computing Machines, 1953.

[2] W. K. Hastings, Monte Carlo Sampling Methods Using Markov Chains and Their Applications, 1970.

[3] P. Peskun, Optimum Monte-Carlo sampling using Markov chains, 1973.

[4] Adrian F. M. Smith et al., Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion), 1993.

[5] D. Rubin et al., Inference from Iterative Simulation Using Multiple Sequences, 1992.

[6] Y. Bechtel et al., A population and family study of N-acetyltransferase using caffeine urinary metabolites, Clinical Pharmacology and Therapeutics, 1993.

[7] Walter R. Gilks et al., Adaptive Direction Sampling, 1994.

[8] S. Chib et al., Bayes inference in regression models with ARMA (p, q) errors, 1994.

[9] A. Gelfand et al., On Markov Chain Monte Carlo Acceleration, 1994.

[10] R. Tweedie et al., Rates of convergence of the Hastings and Metropolis algorithms, 1996.

[11] Peter Green, Markov chain Monte Carlo in Practice, 1996.

[12] Radford M. Neal, Sampling from multimodal distributions using tempered transitions, Statistics and Computing, 1996.

[13] P. Green et al., Corrigendum: On Bayesian analysis of mixtures with an unknown number of components, 1997.

[14] P. Green et al., On Bayesian Analysis of Mixtures with an Unknown Number of Components (with discussion), 1997.

[15] H. Kanekiyo, Population and Family, 1997.

[16] Lars Holden et al., Adaptive Chains, 1998.

[17] J. Rosenthal et al., Optimal scaling of discrete approximations to Langevin diffusions, 1998.

[18] G. Roberts et al., Adaptive Markov Chain Monte Carlo through Regeneration, 1998.

[19] L. Tierney et al., Some adaptive Monte Carlo methods for Bayesian inference, Statistics in Medicine, 1999.

[20] Jun S. Liu et al., The Multiple-Try Method and Local Optimization in Metropolis Sampling, 2000.

[21] C. Robert et al., Controlled MCMC for Optimal Sampling, 2001.

[22] H. Tjelmeland et al., Mode Jumping Proposals in MCMC, 2001.

[23] H. Haario et al., An adaptive Metropolis algorithm, 2001.

[24] H. Rue et al., On Block Updating in Markov Random Field Models for Disease Mapping, 2002.

[25] Anatoly Zhigljavsky et al., Self-regenerative Markov chain Monte Carlo with adaptation, 2003.