Convergence of Markov chain Monte Carlo algorithms with applications to image restoration

Markov chain Monte Carlo algorithms, such as the Gibbs sampler and the Metropolis-Hastings algorithm, are widely used in statistics, computer science, chemistry and physics to explore complicated probability distributions. A critical issue for users of these algorithms is determining how many iterations are required before the output is approximately a sample from the distribution of interest. In this thesis, we give precise bounds on the convergence time of the Gibbs sampler used in the Bayesian restoration of a degraded image. We consider convergence as measured both by the usual choice of metric, total variation distance, and by the Wasserstein metric. In both cases we exploit the coupling characterisation of the metric to obtain our results. Our results can also be applied to the coupling-from-the-past algorithm of Propp and Wilson (1996) to bound its running time. Applying our theoretical results requires computing parameters of the algorithm, which may be prohibitively difficult in many situations; we discuss how auxiliary simulation can be used to estimate these parameters in such cases. We also give a summary of probability metrics and the relationships between them, including several new relationships.
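
To make the coupling and coupling-from-the-past ideas concrete, the sketch below (illustrative only, not code from the thesis; the grid size, the inverse temperature beta, and the function names are assumptions made for this example) implements Propp and Wilson's monotone coupling from the past for a small ferromagnetic Ising model with single-site Gibbs (heat-bath) updates. The Ising model is the prototypical prior in Bayesian image restoration; the likelihood term for the degraded image is omitted to keep the example short. Two chains started from the all-plus and all-minus configurations are driven by identical randomness from time -T up to time 0; once they coalesce, the common state at time 0 is an exact draw from the target distribution, and the coalescence time is exactly the kind of running time that convergence bounds of this type control.

```python
import numpy as np


def heat_bath_update(spins, site, u, beta):
    """One Gibbs (heat-bath) update of a single Ising spin.

    spins: 2D array of +/-1 values; site: (row, col); u: a Uniform(0,1) draw.
    For beta >= 0 this update is monotone in the natural partial order on
    configurations, which is what makes monotone CFTP applicable.
    """
    n, m = spins.shape
    i, j = site
    s = 0  # sum of nearest neighbours (free boundary conditions)
    if i > 0:
        s += spins[i - 1, j]
    if i < n - 1:
        s += spins[i + 1, j]
    if j > 0:
        s += spins[i, j - 1]
    if j < m - 1:
        s += spins[i, j + 1]
    p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))  # P(spin = +1 | neighbours)
    spins[i, j] = 1 if u < p_plus else -1


def monotone_cftp_ising(n, beta, rng, max_doublings=20):
    """Monotone coupling from the past (Propp and Wilson, 1996) for the
    n x n ferromagnetic Ising model under random-scan Gibbs updates.

    The top chain (all +1) and bottom chain (all -1) are run from time -T
    to 0 with the same randomness; when they coalesce, the shared state at
    time 0 is an exact draw from the Gibbs distribution.
    """
    randomness = []  # randomness[k] = (site, uniform) used at time -len(randomness) + k
    T = 1
    for _ in range(max_doublings):
        # Extend the stored randomness back to time -T; variates already
        # assigned to a time must be reused unchanged on later attempts.
        while len(randomness) < T:
            site = (int(rng.integers(n)), int(rng.integers(n)))
            randomness.insert(0, (site, rng.random()))
        top = np.ones((n, n), dtype=int)
        bottom = -np.ones((n, n), dtype=int)
        for site, u in randomness:  # times -T, ..., -1
            heat_bath_update(top, site, u, beta)
            heat_bath_update(bottom, site, u, beta)
        if np.array_equal(top, bottom):  # coalescence at time 0
            return top
        T *= 2  # go further into the past and try again
    raise RuntimeError("no coalescence; increase max_doublings")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = monotone_cftp_ising(n=8, beta=0.3, rng=rng)
    print(sample)
```

Because the heat-bath update is monotone, the two extreme chains sandwich every other starting configuration, so checking coalescence of just these two chains suffices; the same monotone coupling idea underlies bounds on total variation and Wasserstein distance.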

[1] Geoff K. Nicholls, et al., Perfect simulation for sample-based inference, 1999.

[2] Gareth O. Roberts, et al., Markov chain Monte Carlo: some practical implications of theoretical results, 1998.

[3] Jeffrey S. Rosenthal, et al., Convergence Rates for Markov Chains, 1995, SIAM Rev.

[4] J. Besag, et al., Spatial Statistics and Bayesian Computation, 1993.

[5] Ronald L. Wasserstein, et al., Monte Carlo: Concepts, Algorithms, and Applications, 1997.

[6] David Bruce Wilson, et al., Exact sampling with coupled Markov chains and applications to statistical mechanics, 1996, Random Struct. Algorithms.

[8] J. Rosenthal, et al., Possible biases induced by MCMC convergence diagnostics, 1999.

[9] T. Lindvall, Lectures on the Coupling Method, 1992.

[10] C. Hwang, et al., Convergence rates of the Gibbs sampler, the Metropolis algorithm and other single-site updating dynamics, 1993.

[11] M. Piccioni, et al., Importance sampling for families of distributions, 1999.

[12] L. Tierney, Markov Chains for Exploring Posterior Distributions, 1994.

[13] J. Møller, Perfect simulation of conditionally specified models, 1999.

[14] P. Diaconis, et al., Geometric Bounds for Eigenvalues of Markov Chains, 1991.

[15] J. Bernardo, Bayesian Statistics 6: Proceedings of the Sixth Valencia International Meeting, June 6-10, 1998, 1999.

[16] Donald Geman, et al., Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images, 1984, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[17] Mark Jerrum, et al., Polynomial-Time Approximation Algorithms for the Ising Model, 1990, SIAM J. Comput.

[18] James Allen Fill, An interruptible algorithm for perfect sampling via Markov chains, 1997, STOC '97.

[19] R. Tweedie, et al., Perfect simulation and backward coupling, 1998.

[20] F. Martinelli, Lectures on Glauber dynamics for discrete spin models, 1999.

[21] N. Metropolis, et al., Equation of State Calculations by Fast Computing Machines, 1953, Journal of Chemical Physics.

[22] Dana Randall, et al., Markov Chain Algorithms for Planar Lattice Structures (Extended Abstract), 1995, FOCS 1995.

[23] J. A. Cuesta-Albertos, et al., A characterization for the solution of the Monge–Kantorovich mass transference problem, 1993.

[24] Ronald A. Thisted, et al., Elements of statistical computing, 1986.

[25] Ludger Rüschendorf, et al., Distributions with fixed marginals and related topics, 1999.

[26] Walter R. Gilks, et al., Introduction to general state-space Markov chain theory, 1995.

[27] Mary Kathryn Cowles, et al., A simulation approach to convergence rates for Markov chain Monte Carlo algorithms, 1998, Stat. Comput.

[28] L. Le Cam, et al., Asymptotic Methods in Statistical Decision Theory, 1986.

[29] P. Green, et al., Metropolis Methods, Gaussian Proposals and Antithetic Variables, 1992.

[30] Peter Green, et al., Markov chain Monte Carlo in Practice, 1996.

[31] B. Lindsay, Efficiency versus robustness: the case for minimum Hellinger distance and related methods, 1994.

[32] R. Reiss, Approximate Distributions of Order Statistics, 1989.

[33] F. Su, Convergence of random walks on the circle generated by an irrational rotation, 1998.

[34] P. Green, et al., Exact Sampling from a Continuous State Space, 1998.

[35] S. Rachev, The Monge–Kantorovich Mass Transference Problem and Its Stochastic Applications, 1985.

[36] P. Diaconis, et al., Updating Subjective Probability, 1982.

[37] P. Diaconis, Group representations in probability and statistics, 1988.

[38] P. Diaconis, et al., Comparison Theorems for Reversible Markov Chains, 1993.

[39] Walter R. Gilks, et al., MCMC in image analysis, 1995.

[40] Nicholas G. Polson, et al., Sampling from log-concave distributions, 1994.

[41] S. Rosenthal, et al., A review of asymptotic convergence for general state space Markov chains, 2002.

[42] R. M. Dudley, Real Analysis and Probability: Measurability: Borel Isomorphism and Analytic Sets, 2002.

[44] Mark Jerrum, et al., Approximate Counting, Uniform Generation and Rapidly Mixing Markov Chains, 1987, WG.

[45] Bradley P. Carlin, et al., Markov chain Monte Carlo convergence diagnostics: a comparative review, 1996.

[46] Adrian F. M. Smith, et al., Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion), 1993.

[47] J. N. Corcoran, et al., Perfect Sampling of Harris Recurrent Markov Chains, 1999.

[48] B. Cipra, An introduction to the Ising model, 1987.

[49] William Feller, et al., An Introduction to Probability Theory and Its Applications, 1950.

[50] Peter Green, et al., Exact sampling for Bayesian inference: towards general purpose algorithms, 1998.

[51] A. Szulga, On Minimal Metrics in the Space of Random Variables, 1983.

[52] Adrian F. M. Smith, et al., Sampling-Based Approaches to Calculating Marginal Densities, 1990.

[53] J. Besag, et al., Bayesian Computation and Stochastic Systems, 1995.

[54] R. Durrett, Probability: Theory and Examples, 1993.

[55] R. Tweedie, et al., Rates of convergence of the Hastings and Metropolis algorithms, 1996.

[56] P. Diaconis, et al., Strong uniform times and finite random walks, 1987.

[57] R. Graham, et al., Random Walks Arising in Random Number Generation, 1987.

[58] Dana Randall, et al., Markov Chain Algorithms for Planar Lattice Structures, 2001, SIAM J. Comput.

[59] Rick Durrett, Probability Metrics and the Stability of Stochastic Models (Svetlozar T. Rachev), 1992, SIAM Rev.

[60] L. Le Cam, Théorie asymptotique de la décision statistique, 1969.

[61] D. Aldous, Random walks on finite groups and rapidly mixing Markov chains, 1983.

[62] D. Murdoch, Exact Sampling for Bayesian Inference: Unbounded State Spaces, 2000.

[63] Alistair Sinclair, Improved Bounds for Mixing Rates of Markov Chains and Multicommodity Flow, 1992, Combinatorics, Probability and Computing.

[64] Jim Freeman, Probability Metrics and the Stability of Stochastic Models, 1991.

[65] A. Frigessi, et al., Computational complexity of Markov chain Monte Carlo methods for finite Markov random fields, 1997.

[66] S. Ingrassia, On the rate of convergence of the Metropolis algorithm and Gibbs sampler by geometric bounds, 1994.

[67] Richard L. Tweedie, et al., Markov Chains and Stochastic Stability, 1993, Communications and Control Engineering Series.

[68] W. K. Hastings, et al., Monte Carlo Sampling Methods Using Markov Chains and Their Applications, 1970.

[69] Mary Kathryn Cowles, MCMC Sampler Convergence Rates for Hierarchical Normal Linear Models: A Simulation Approach, 2002, Stat. Comput.

[70] David Bruce Wilson, et al., How to Get a Perfectly Random Sample from a Generic Markov Chain and Generate a Random Spanning Tree of a Directed Graph, 1998, J. Algorithms.

[71] Patrick Billingsley, Probability and Measure, 1986.

[72] J. Besag, On the Statistical Analysis of Dirty Pictures, 1986.

[73] J. Rosenthal, et al., Convergence of Slice Sampler Markov Chains, 1999.