HOGWILD!-Gibbs can be PanAccurate

Asynchronous Gibbs sampling has recently been shown to be fast-mixing and accurate for estimating probabilities of events on a small number of variables of a graphical model satisfying Dobrushin's condition~\cite{DeSaOR16}. We investigate whether it can be used to accurately estimate expectations of functions of {\em all the variables} of the model. Under the same condition, we show that the synchronous (sequential) and asynchronous Gibbs samplers can be coupled so that the expected Hamming distance between their (multivariate) samples remains bounded by $O(\tau \log n)$, where $n$ is the number of variables in the graphical model and $\tau$ is a measure of the asynchronicity. A similar bound holds for any constant power of the Hamming distance. Hence, the expectation of any function that is Lipschitz with respect to a power of the Hamming distance can be estimated with a bias that grows only logarithmically in $n$. Going beyond Lipschitz functions, we consider the bias arising from asynchronicity in estimating the expectation of polynomial functions of all variables in the model. Using recent concentration of measure results~\cite{DaskalakisDK17,GheissariLP17,GotzeSS18}, we show that the bias introduced by the asynchronicity is of smaller order than the standard deviation of the function value already present in the true model. We perform experiments on a multi-processor machine to empirically illustrate our theoretical findings.
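The coupling idea above can be illustrated with a toy simulation: run a sequential Gibbs chain and a stale-read ("HOGWILD!"-style) chain on the same Ising model, feed both the same site choices and uniform random numbers (the coupling), and measure the Hamming distance between their states. This is a minimal hypothetical sketch, not the paper's exact construction; the cycle graph, the parameters `beta`, `tau`, and the function name `gibbs_coupled` are all illustrative assumptions.

```python
import math
import random

def gibbs_coupled(n=50, beta=0.2, steps=2000, tau=2, seed=0):
    """Couple a sequential Gibbs chain x with an asynchronous chain y
    (which reads a tau-step-stale snapshot of its own state) on an
    Ising model over a cycle of n spins. Toy illustration only."""
    rng = random.Random(seed)
    x = [1] * n           # sequential chain
    y = [1] * n           # asynchronous chain
    hist = [list(y)]      # past states of y, used to simulate stale reads
    for _ in range(steps):
        i = rng.randrange(n)   # shared site choice
        u = rng.random()       # shared uniform -> the coupling
        # Sequential chain: P(x_i = +1 | neighbors) under the Ising model.
        s = x[(i - 1) % n] + x[(i + 1) % n]
        p = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
        x[i] = 1 if u < p else -1
        # Asynchronous chain: neighbors are read from a stale snapshot,
        # modeling delayed reads by concurrent processors.
        stale = hist[max(0, len(hist) - 1 - tau)]
        s2 = stale[(i - 1) % n] + stale[(i + 1) % n]
        p2 = 1.0 / (1.0 + math.exp(-2.0 * beta * s2))
        y[i] = 1 if u < p2 else -1
        hist.append(list(y))
    ham = sum(a != b for a, b in zip(x, y))
    return x, y, ham
```

Because both chains consume identical randomness, they disagree at a site only when the stale neighborhood flips the update's conditional probability across the shared uniform draw; in the Dobrushin regime such disagreements stay rare, which is the mechanism behind the $O(\tau \log n)$ bound.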

[1] Glenn Ellison. Learning, Local Interaction, and Coordination, 1993.

[2] Christopher De Sa, et al. Ensuring Rapid Mixing and Low Bias for Asynchronous Gibbs Sampling, 2016, ICML.

[3] Dimitris S. Papailiopoulos, et al. Perturbed Iterate Analysis for Asynchronous Stochastic Optimization, 2015, SIAM J. Optim.

[4] Andrea Montanari, et al. The spread of innovations in social networks, 2010, Proceedings of the National Academy of Sciences.

[5] Stuart Geman, et al. Markov Random Field Image Models and Their Applications to Computer Vision, 2010.

[6] Simon Osindero, et al. Dogwild! – Distributed Hogwild for CPU & GPU, 2014.

[7] Alexander J. Smola, et al. An architecture for parallel topic models, 2010, Proc. VLDB Endow.

[8] Kunle Olukotun, et al. Taming the Wild: A Unified Analysis of Hogwild-Style Algorithms, 2015, NIPS.

[9] Christopher Ré, et al. DimmWitted: A Study of Main-Memory Statistical Analytics, 2014, Proc. VLDB Endow.

[10] Stephen J. Wright, et al. An asynchronous parallel stochastic coordinate descent algorithm, 2013, J. Mach. Learn. Res.

[11] Constantinos Daskalakis, et al. Concentration of Multilinear Functions of the Ising Model with Applications to Network Data, 2017, NIPS.

[12] Stephen J. Wright, et al. Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent, 2011, NIPS.

[13] Alexandros G. Dimakis, et al. FrogWild! – Fast PageRank Approximations on Graph Engines, 2015, Proc. VLDB Endow.

[14] Y. Peres, et al. Concentration inequalities for polynomials of contracting Ising models, 2017, arXiv:1706.00121.

[15] Matthew J. Johnson, et al. Analyzing Hogwild Parallel Gaussian Gibbs Sampling, 2013, NIPS.

[16] Inderjit S. Dhillon, et al. Scalable Coordinate Descent Approaches to Parallel Matrix Factorization for Recommender Systems, 2012, IEEE 12th International Conference on Data Mining.

[17] Elchanan Mossel, et al. Evolutionary trees and the Ising model on the Bethe lattice: a proof of Steel's conjecture, 2005, arXiv.

[18] Holger Sambale, et al. Higher order concentration for functions of weakly dependent random variables, 2018, Electronic Journal of Probability.

[19] Daniel Simpson, et al. Asynchronous Gibbs Sampling, 2015, AISTATS.