Markov chain Monte Carlo methods: Theory and practice

Abstract In many situations, especially in Bayesian statistical analysis, it is necessary to draw samples from intractable probability distributions. A common way to obtain approximate samples from such distributions is to use Markov chain Monte Carlo (MCMC) algorithms. Two questions arise when using MCMC algorithms. The first is how long the underlying Markov chain must run before its states can be treated as approximate samples from the desired distribution. The second is how frequently states of the chain should be retained so that the resulting samples are roughly independent. This chapter provides insight into how to answer both of these questions in the course of describing how MCMC algorithms are used in practice. Common types of MCMC algorithms are described, and Bayesian estimation using the output of the chain is also discussed.
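As a rough illustration of the two practical questions raised above, the sketch below implements a one-dimensional random-walk Metropolis sampler with a burn-in period (discarding early states while the chain approaches the target) and a thinning interval (retaining only every few states so the kept draws are roughly independent). The function name and the default values of `burn_in`, `thin`, and `proposal_sd` are illustrative assumptions, not prescriptions from the chapter.

```python
import numpy as np

def random_walk_metropolis(log_target, x0, n_samples, burn_in=1000, thin=10,
                           proposal_sd=1.0, rng=None):
    """Minimal random-walk Metropolis sketch (assumed setup, not the chapter's code).

    `burn_in` discards early states while the chain converges toward the target;
    `thin` keeps every `thin`-th state afterwards so retained draws are roughly
    independent.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    log_p = log_target(x)
    kept = []
    total_iters = burn_in + n_samples * thin
    for t in range(total_iters):
        # Symmetric Gaussian random-walk proposal.
        proposal = x + proposal_sd * rng.standard_normal()
        log_p_prop = log_target(proposal)
        # Metropolis acceptance step: accept with probability min(1, ratio).
        if np.log(rng.uniform()) < log_p_prop - log_p:
            x, log_p = proposal, log_p_prop
        # Keep only post-burn-in states, sub-sampled every `thin` iterations.
        if t >= burn_in and (t - burn_in) % thin == 0:
            kept.append(x)
    return np.array(kept)

# Example: approximate samples from a standard normal target (log density up to a constant).
draws = random_walk_metropolis(lambda x: -0.5 * x**2, x0=0.0, n_samples=2000)
print(draws.mean(), draws.std())
```

In practice the burn-in length and thinning interval are chosen with the aid of convergence diagnostics and autocorrelation estimates rather than fixed in advance, as the chapter discusses.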
