Recycling intermediate steps to improve Hamiltonian Monte Carlo

Hamiltonian Monte Carlo (HMC) and related algorithms have become routinely used in Bayesian computation. In this article, we present a simple and provably accurate method to improve the efficiency of HMC and related algorithms with essentially no extra computational cost. This is achieved by recycling the intermediate states along simulated trajectories of Hamiltonian dynamics. Standard algorithms use only the end points of trajectories, wastefully discarding all the intermediate states. Compared to alternative methods for utilizing the intermediate states, our algorithm is simpler to apply in practice and requires little programming effort beyond the usual implementations of HMC and related algorithms. Our algorithm applies straightforwardly to the no-U-turn sampler, arguably the most popular variant of HMC. Through a variety of experiments, we demonstrate that our recycling algorithm yields substantial computational efficiency gains.
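To make the idea concrete, below is a minimal sketch of an HMC iteration that keeps every intermediate leapfrog state instead of only the end point. It assumes a standard leapfrog integrator and, purely for illustration, weights each intermediate state by its Metropolis acceptance probability; this simple weighting is an assumption of the sketch, not necessarily the exact recycling estimator of the article. All function and variable names (`leapfrog`, `hmc_recycle`, `log_p`, `grad_log_p`) are hypothetical.

```python
import numpy as np

def leapfrog(q, p, grad_log_p, eps, n_steps):
    """Leapfrog integrator that records every intermediate state,
    with position and momentum synchronized at the same time point."""
    traj = []
    for _ in range(n_steps):
        p_half = p + 0.5 * eps * grad_log_p(q)   # half step for momentum
        q = q + eps * p_half                     # full step for position
        p = p_half + 0.5 * eps * grad_log_p(q)   # second half step for momentum
        traj.append((q.copy(), p.copy()))
    return traj

def hmc_recycle(log_p, grad_log_p, q0, eps=0.1, n_steps=20, n_iter=1000, rng=None):
    """HMC that recycles intermediate states: every leapfrog state is kept as a
    weighted sample (illustrative weights), while the chain itself still moves
    by the usual Metropolis accept/reject step on the trajectory end point."""
    rng = np.random.default_rng() if rng is None else rng
    q = np.asarray(q0, dtype=float)
    samples, weights = [], []
    for _ in range(n_iter):
        p = rng.standard_normal(q.shape)
        h0 = -log_p(q) + 0.5 * p @ p             # Hamiltonian at the start
        traj = leapfrog(q.copy(), p.copy(), grad_log_p, eps, n_steps)
        # Metropolis acceptance probability of each intermediate state.
        accs = np.array([min(1.0, np.exp(h0 - (-log_p(qi) + 0.5 * pi @ pi)))
                         for qi, pi in traj])
        # Recycle: keep all intermediate states instead of discarding them.
        for (qi, _), a in zip(traj, accs):
            samples.append(qi)
            weights.append(a)
        # Standard end-point accept/reject keeps the chain itself valid.
        q_end, _ = traj[-1]
        if rng.random() < accs[-1]:
            q = q_end
    return np.array(samples), np.array(weights)
```

As a usage illustration, for a standard normal target one could call `samples, w = hmc_recycle(lambda q: -0.5 * q @ q, lambda q: -q, np.zeros(2))` and form weighted estimates such as `np.average(samples, axis=0, weights=w)`; the point of the sketch is simply that all leapfrog states enter the estimator at no extra cost, since their log densities and gradients were already computed during the trajectory.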
