Convergence in Variance of Chebyshev Accelerated Gibbs Samplers

A stochastic version of a stationary linear iterative solver may be designed to converge in distribution to a probability distribution with a specified mean μ and covariance matrix A^{-1}. A common example is Gibbs sampling applied to a multivariate Gaussian distribution, which is a stochastic version of the Gauss-Seidel linear solver. The iteration operator that acts on the error in mean and covariance in the stochastic iteration is the same iteration operator that acts on the solution error in the linear solver, and thus both the stationary sampler and the stationary solver have the same error polynomial and geometric convergence rate. The polynomial acceleration techniques that are well known in numerical analysis for accelerating the linear solver may also be used to accelerate the stochastic iteration. We derive first-order and second-order Chebyshev polynomial acceleration for the stochastic iteration to accelerate convergence in the mean and covariance by mimicking the derivation for the linear solver. In particular, we show that the error polynomials are identical and hence so are the convergence rates. Thus, optimality of the Chebyshev accelerated solver implies optimality of the Chebyshev accelerated sampler. We give an algorithm for the stochastic version of the second-order Chebyshev accelerated SSOR (symmetric successive overrelaxation) iteration and provide numerical examples of sampling from multivariate Gaussian distributions to confirm that the desired convergence properties are achieved in finite precision.
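
As a concrete illustration of the stationary (unaccelerated) sampler described above, the following minimal Python/NumPy sketch implements the stochastic Gauss-Seidel (component-sweep Gibbs) iteration targeting N(μ, A^{-1}); the test matrix A, right-hand side b, and the helper name gibbs_gauss_seidel_sampler are illustrative assumptions, not taken from the paper. With the splitting A = M - N, where M = D + L is the lower triangle of A, the injected noise must have covariance M^T + N = D for the chain's covariance to converge to A^{-1}; the Chebyshev-accelerated versions derived in the paper replace the fixed splitting parameters with iteration-dependent coefficients.

    import numpy as np

    def gibbs_gauss_seidel_sampler(A, b, n_sweeps=200, rng=None):
        """Stochastic Gauss-Seidel (Gibbs) sweep targeting N(mu, A^{-1}), with A mu = b.

        Uses the splitting A = M - N, M = D + L (lower triangle of A).  The fixed
        point of the mean recursion solves A mu = b, and injecting noise with
        covariance M^T + N = D makes the fixed-point covariance equal A^{-1}.
        """
        rng = np.random.default_rng() if rng is None else rng
        n = A.shape[0]
        M = np.tril(A)                      # D + L (lower triangular)
        N = M - A                           # so that A = M - N
        noise_std = np.sqrt(np.diag(A))     # Var(c) = D, diagonal, so cheap to sample
        y = np.zeros(n)
        for _ in range(n_sweeps):
            c = noise_std * rng.standard_normal(n)    # c ~ N(0, D)
            y = np.linalg.solve(M, N @ y + b + c)     # one Gauss-Seidel sweep
        return y

    # Sanity check on a small precision matrix: the sample mean should approach
    # mu = A^{-1} b and the sample covariance should approach A^{-1}.
    if __name__ == "__main__":
        A = np.array([[4.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])
        b = np.array([1.0, 0.0, 2.0])
        draws = np.stack([gibbs_gauss_seidel_sampler(A, b) for _ in range(2000)])
        print(draws.mean(axis=0), np.linalg.solve(A, b))
        print(np.cov(draws, rowvar=False), np.linalg.inv(A))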
