Clone MCMC: Parallel High-Dimensional Gaussian Gibbs Sampling

We propose a generalized Gibbs sampler for obtaining samples approximately distributed according to a high-dimensional Gaussian distribution. As with Hogwild methods, our approach does not target the original Gaussian distribution of interest but an approximation to it. Unlike Hogwild methods, however, a single parameter lets us trade bias for variance. We show empirically that our method is very flexible and performs well compared to Hogwild-type algorithms.
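To make the bias-variance tradeoff concrete, the following sketch shows one way to build a coupled ("clone") Gibbs sampler of the kind the abstract describes. It is our own reconstruction, not the authors' exact algorithm: the joint density, the function name `clone_gibbs`, and the stated bias formula are all derived under our assumptions. Two copies `x`, `y` of the state are coupled with strength `eta`; Gibbs sampling their conditionals leaves each marginal Gaussian with the correct mean but an inflated covariance that shrinks as `eta` grows, while large `eta` slows mixing.

```python
import numpy as np

def clone_gibbs(A, mu, eta, n_iter, rng):
    """Illustrative coupled ("clone") Gibbs sampler for N(mu, A^{-1}).

    Two copies x, y are coupled with strength eta via the joint density
        p(x, y) propto exp(-eta/2 * ||x - y||^2
                           - 1/4 (x-mu)' A (x-mu)
                           - 1/4 (y-mu)' A (y-mu)),
    whose x-marginal (under our derivation) is Gaussian with mean mu and
    covariance A^{-1} + (4*eta*I + A)^{-1}: larger eta means less bias
    but stronger coupling between the copies, hence slower mixing.
    """
    d = A.shape[0]
    D = eta * np.eye(d) + 0.5 * A      # conditional precision of x | y
    L = np.linalg.cholesky(D)          # for drawing N(0, D^{-1}) noise
    b = 0.5 * (A @ mu)                 # linear term from the mean shift
    x = mu.copy()
    y = mu.copy()
    out = np.empty((n_iter, d))
    for t in range(n_iter):
        # x | y ~ N(D^{-1}(eta*y + b), D^{-1}); symmetric update for y.
        x = np.linalg.solve(D, eta * y + b) \
            + np.linalg.solve(L.T, rng.standard_normal(d))
        y = np.linalg.solve(D, eta * x + b) \
            + np.linalg.solve(L.T, rng.standard_normal(d))
        out[t] = x
    return out
```

Note the tradeoff in `D = eta*I + A/2`: small `eta` gives fast-mixing but strongly biased samples, while large `eta` makes `D` diagonally dominant (convenient for cheap, parallelizable approximate solves, in the spirit of the paper's parallel setting) at the cost of near-unit autocorrelation between sweeps.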
