Fast $\epsilon$-free Inference of Simulation Models with Bayesian Conditional Density Estimation

Many statistical models can be simulated forwards but have intractable likelihoods. Approximate Bayesian Computation (ABC) methods are used to infer properties of these models from data. Traditionally these methods approximate the posterior over parameters by conditioning on data being inside an $\epsilon$-ball around the observed data, which is only correct in the limit $\epsilon \to 0$. Monte Carlo methods can then draw samples from the approximate posterior to approximate predictions or error bars on parameters. These algorithms critically slow down as $\epsilon \to 0$, and in practice draw samples from a broader distribution than the posterior. We propose a new approach to likelihood-free inference based on Bayesian conditional density estimation. Preliminary inferences based on limited simulation data are used to guide later simulations. In some cases, learning an accurate parametric representation of the entire true posterior distribution requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.
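To make the contrast in the abstract concrete, the following is a minimal sketch (not the paper's code) on a toy Gaussian-mean problem: first $\epsilon$-ball rejection ABC, then a simple conditional density fit of $p(\theta \mid s)$ from simulated pairs, standing in for the paper's mixture density networks. The simulator, prior, $\epsilon$ value, and the one-component linear-Gaussian density model below are all illustrative assumptions.

```python
# Toy comparison of epsilon-ball rejection ABC vs. learning a conditional
# density estimate of the posterior. All modelling choices here are
# illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=20):
    """Toy forward model: n Gaussian draws with unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    """Summary statistic: the sample mean."""
    return x.mean()

x_obs = simulator(1.5)      # "observed" data from a true mean of 1.5
s_obs = summary(x_obs)

def rejection_abc(n_samples, epsilon):
    """Keep prior draws whose simulated summary lands in an epsilon-ball
    around the observed summary; only correct in the limit epsilon -> 0."""
    accepted, n_sims = [], 0
    while len(accepted) < n_samples:
        theta = rng.normal(0.0, 5.0)            # broad Gaussian prior
        n_sims += 1
        if abs(summary(simulator(theta)) - s_obs) < epsilon:
            accepted.append(theta)
    return np.array(accepted), n_sims

def fit_conditional_gaussian(n_sims):
    """Fit q(theta | s) = N(a*s + b, sigma^2) from simulated (theta, s)
    pairs by least squares; a crude stand-in for a mixture density network,
    targeting the posterior directly because the proposal is the prior."""
    thetas = rng.normal(0.0, 5.0, size=n_sims)
    ss = np.array([summary(simulator(t)) for t in thetas])
    A = np.column_stack([ss, np.ones_like(ss)])
    coef, *_ = np.linalg.lstsq(A, thetas, rcond=None)
    a, b = coef
    sigma = (thetas - (a * ss + b)).std()
    return a * s_obs + b, sigma

samples, cost = rejection_abc(n_samples=200, epsilon=0.1)
print(f"rejection ABC: mean ~ {samples.mean():.2f} after {cost} simulations")

mu, sigma = fit_conditional_gaussian(n_sims=1000)
print(f"conditional density fit: N({mu:.2f}, {sigma:.2f}^2) from 1000 simulations")
```

In this toy setting the rejection step needs many simulations per accepted sample and its cost grows as $\epsilon$ shrinks, whereas the density fit uses a fixed simulation budget and returns a full parametric approximation to the posterior; the paper's method replaces the simple Gaussian fit with mixture density networks and uses preliminary fits to choose where to simulate next.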
