Discrete-Continuous Mixtures in Probabilistic Programming: Generalized Semantics and Inference Algorithms

Despite the recent successes of probabilistic programming languages (PPLs) in AI applications, PPLs offer only limited support for random variables whose distributions combine discrete and continuous elements. We develop the notion of measure-theoretic Bayesian networks (MTBNs) and use it to provide more general semantics for PPLs with arbitrarily many random variables defined over arbitrary measure spaces. We develop two new sampling algorithms that are provably correct under the MTBN framework: lexicographic likelihood weighting (LLW) for general MTBNs, and the lexicographic particle filter (LPF), a specialized algorithm for state-space models. We further integrate MTBNs into a widely used PPL system, BLOG, and verify the effectiveness of the new inference algorithms through representative examples.
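To make the setting concrete, the sketch below shows plain likelihood weighting (not the paper's LLW, whose lexicographic weight ordering is not reproduced here) applied to a toy discrete-continuous mixture: a latent variable X that places a point mass at 0 with probability 0.5 and otherwise follows an Exponential(1) density, observed through Gaussian noise. The model, the prior weight `p_zero`, and the query P(X = 0 | Y) are illustrative assumptions, not drawn from the paper.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of a Normal(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior_point_mass(y_obs, n=100_000, p_zero=0.5, seed=0):
    """Estimate P(X = 0 | Y = y_obs) by likelihood weighting.

    Prior on X: point mass at 0 with prob p_zero, else Exponential(1).
    Likelihood: Y ~ Normal(X, 1).
    """
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        # Sample X from its discrete-continuous mixture prior.
        x = 0.0 if rng.random() < p_zero else rng.expovariate(1.0)
        # Weight the sample by the likelihood of the evidence.
        w = normal_pdf(y_obs, x, 1.0)
        num += w * (x == 0.0)
        den += w
    return num / den
```

As expected, the estimated posterior mass at 0 exceeds the prior when the observation sits at 0 and shrinks toward zero as the observation moves away. Note that plain likelihood weighting handles this example only because the point mass lives in the prior, not the evidence; conditioning on a mixed-type observation is exactly where a lexicographic treatment of weights becomes necessary.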
