Distributed and accelerated inference algorithms for probabilistic graphical models

Learning graphical models from data is of fundamental importance in machine learning and statistics; however, this task is often computationally challenging due to the complexity of the models and the large scale of the data sets involved. This dissertation presents a variety of distributed and accelerated inference algorithms for probabilistic graphical models. The first part of the dissertation focuses on a class of directed latent variable models known as topic models. We introduce synchronous and asynchronous distributed algorithms for topic models that yield significant time and memory savings without sacrificing accuracy. We also investigate several approximate inference techniques for topic models, including collapsed Gibbs sampling, variational inference, and maximum a posteriori estimation, and find that these methods learn models of comparable accuracy provided that the hyperparameters are optimized, giving us the freedom to use the most computationally efficient algorithm. The second part of the dissertation focuses on accelerated parameter estimation techniques for undirected models such as Boltzmann machines and exponential random graph models. We investigate an efficient blocked contrastive divergence approach based on the composite likelihood framework. We also present a particle filtering approach for approximate maximum likelihood estimation that outperforms previously proposed estimation algorithms.