Lifted Message Passing for Hybrid Probabilistic Inference

Lifted inference algorithms for first-order logic models, e.g., Markov logic networks (MLNs), have attracted significant interest in recent years. Lifted inference methods exploit model symmetries to reduce the effective size of the model and, consequently, the computational cost of inference. In this work, we consider the problem of lifted inference in MLNs whose groundings are continuous, or a mix of discrete and continuous. Existing work on lifting with continuous groundings has mostly been limited to special classes of models, e.g., Gaussian models, for which variable elimination or message-passing updates can be computed exactly. Here, we develop approximate lifted inference schemes based on particle sampling. We demonstrate empirically that our approximate lifting schemes perform comparably to existing state-of-the-art methods on Gaussian MLNs, while having the flexibility to be applied to models with arbitrary potential functions.
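The particle-sampling idea underlying such schemes can be illustrated with a minimal, self-contained sketch: a belief-propagation message over a continuous variable is replaced by an importance-sampling estimate evaluated at sampled particles. The toy two-variable Gaussian model, the proposal distribution, and all function names below are hypothetical illustrations chosen so the estimate can be checked against a closed-form answer; this is not the paper's actual algorithm.

```python
import math
import random

random.seed(0)

def gauss(x, mu=0.0, s2=1.0):
    """Density of N(mu, s2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

# Hypothetical toy pairwise model over two continuous variables x and y:
#   phi_x(x)  = N(x; 0, 1)   unary potential on x
#   psi(x, y) = N(y; x, 1)   pairwise coupling
def phi_x(x):
    return gauss(x, 0.0, 1.0)

def psi(x, y):
    return gauss(y, x, 1.0)

# Draw particles for x from a broad proposal q = N(0, 4).
N = 5000
q_s2 = 4.0
particles_x = [random.gauss(0.0, math.sqrt(q_s2)) for _ in range(N)]

def message_x_to_y(y):
    """Particle estimate of the BP message m_{x->y}(y):
       m(y) ~= (1/N) * sum_i  psi(x_i, y) * phi_x(x_i) / q(x_i)."""
    total = 0.0
    for xi in particles_x:
        w = phi_x(xi) / gauss(xi, 0.0, q_s2)  # importance weight
        total += w * psi(xi, y)
    return total / N

# In this tree-structured toy model the message equals the marginal of y,
# which is N(0, 2) in closed form; compare the two at y = 0.
est = message_x_to_y(0.0)
true_val = gauss(0.0, 0.0, 2.0)
print(est, true_val)
```

The same estimator applies unchanged when `phi_x` and `psi` are arbitrary non-Gaussian potentials, which is what makes particle-based message updates attractive for hybrid models where exact integration is unavailable.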
