Amortized Monte Carlo Integration

Current approaches to amortizing Bayesian inference focus solely on approximating the posterior distribution. Typically, this approximation is, in turn, used to calculate expectations for one or more target functions, a computational pipeline that is inefficient when the target function(s) are known up front. In this paper, we address this inefficiency by introducing AMCI, a method for amortizing Monte Carlo integration directly. AMCI operates similarly to amortized inference but produces three distinct amortized proposals, each tailored to a different component of the overall expectation calculation. At runtime, samples are drawn separately from each amortized proposal before being combined into an overall estimate of the expectation. We show that, while existing approaches are fundamentally limited in the accuracy they can achieve, AMCI can in theory produce arbitrarily small errors for any integrable target function using only a single sample from each proposal at runtime. We further show that it empirically outperforms the theoretically optimal self-normalized importance sampler on a number of example problems. Furthermore, AMCI allows amortization not only over datasets but also over target functions.
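To make the estimator concrete, the following is a minimal sketch of how three learned proposals might be combined at runtime. It assumes one natural reading of the abstract's "three components": the posterior expectation is written as a ratio of integrals, mu(y) = (E+ - E-) / Z with E+- = integral f+-(x) p(x, y) dx and Z = integral p(x, y) dx, where f+ and f- are the positive and negative parts of the target function f, and each integral gets its own importance sampling estimate from its own proposal. The proposal objects, their sample/log_prob interface, and all function names here are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def is_estimate(weight_fn, log_joint, q, n):
    """One importance sampling estimate of integral weight_fn(x) p(x, y) dx
    using proposal q (y is fixed and baked into log_joint)."""
    xs = q.sample(n)                        # n samples from this proposal
    log_w = log_joint(xs) - q.log_prob(xs)  # log importance weights
    return np.mean(weight_fn(xs) * np.exp(log_w))

def amci_estimate(f, log_joint, q_plus, q_minus, q_marg, n=1):
    """Combine the three per-component estimates into mu(y) = (E+ - E-) / Z.

    q_plus, q_minus, q_marg are hypothetical learned proposals targeting
    f+(x)p(x, y), f-(x)p(x, y), and p(x, y) respectively, each assumed to
    expose .sample(n) and .log_prob(x)."""
    e_plus = is_estimate(lambda x: np.maximum(f(x), 0.0), log_joint, q_plus, n)
    e_minus = is_estimate(lambda x: np.maximum(-f(x), 0.0), log_joint, q_minus, n)
    z = is_estimate(lambda x: 1.0, log_joint, q_marg, n)
    return (e_plus - e_minus) / z
```

Under this decomposition, if each proposal were exactly proportional to its integrand (q_plus to f+ p, q_minus to f- p, q_marg to p), every term would be exact even with n = 1, which is the sense in which the single-sample claim in the abstract should be read.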
