On the geometric convergence for MALA under verifiable conditions

While the Metropolis-Adjusted Langevin Algorithm (MALA) is a popular and widely used Markov chain Monte Carlo method, few papers derive conditions that ensure its convergence. In particular, to the authors' knowledge, assumptions that are both easy to verify and guarantee geometric convergence are still missing. In this work, we establish V-uniformly geometric convergence for MALA under mild assumptions on the target distribution. Unlike previous work, we only impose tail and smoothness conditions on the potential associated with the target distribution. These conditions are quite common in the MCMC literature and are easy to verify in practice. Finally, we pay special attention to how the bounds we derive depend on the step size of the Euler-Maruyama discretization, which defines the proposal Markov kernel of MALA.
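For readers unfamiliar with the algorithm, the mechanism the abstract refers to can be sketched as follows: the Euler-Maruyama discretization of the Langevin diffusion serves as the proposal, and a Metropolis-Hastings correction restores exact invariance with respect to the target. This is a minimal illustrative sketch, not the paper's analysis; the function names and the 1-D Gaussian example (potential U(x) = x²/2) are our own assumptions.

```python
import math
import random

def mala_step(x, grad_U, log_pi, h, rng=random):
    """One MALA transition with step size h for a 1-D target."""
    # Euler-Maruyama proposal: y = x - h * grad_U(x) + sqrt(2h) * N(0, 1)
    mean_fwd = x - h * grad_U(x)
    y = mean_fwd + math.sqrt(2.0 * h) * rng.gauss(0.0, 1.0)
    # Log densities of the Gaussian proposal kernel q (normalizing
    # constants cancel in the acceptance ratio, so they are omitted)
    log_q_fwd = -(y - mean_fwd) ** 2 / (4.0 * h)
    mean_bwd = y - h * grad_U(y)
    log_q_bwd = -(x - mean_bwd) ** 2 / (4.0 * h)
    # Metropolis-Hastings accept/reject step
    log_alpha = log_pi(y) + log_q_bwd - log_pi(x) - log_q_fwd
    if math.log(rng.random()) < log_alpha:
        return y, True
    return x, False

# Example: standard Gaussian target, U(x) = x**2 / 2, grad_U(x) = x
if __name__ == "__main__":
    random.seed(0)
    x, n_accept = 3.0, 0
    for _ in range(5000):
        x, accepted = mala_step(x, grad_U=lambda t: t,
                                log_pi=lambda t: -t * t / 2.0, h=0.5)
        n_accept += accepted
    print("acceptance rate:", n_accept / 5000)
```

Without the accept/reject correction, the chain would target a biased stationary distribution whose error depends on h; the Metropolis adjustment removes this bias, and the paper's bounds quantify how the convergence rate degrades as h varies.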
