Stochastic Gradient Bayesian Optimal Experimental Designs for Simulation-based Inference

Simulation-based inference (SBI) methods tackle the challenging inverse problems posed by complex scientific models with intractable likelihoods. However, many SBI simulators are non-differentiable, which hampers the use of gradient-based optimization techniques. Bayesian Optimal Experimental Design (BOED) is a powerful approach that aims to make the most efficient use of experimental resources for improved inference. While stochastic gradient BOED methods have shown promising results on high-dimensional design problems, they have rarely been combined with SBI, precisely because so many SBI simulators are non-differentiable. In this work, we establish a connection between ratio-based SBI algorithms and stochastic gradient variational inference by leveraging mutual information bounds. This connection allows us to extend BOED to SBI applications, enabling the simultaneous optimization of experimental designs and amortized inference functions. We demonstrate our approach on a simple linear model and provide implementation details for practitioners.
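
To make the joint optimization concrete, here is a minimal sketch in PyTorch (not the paper's code): it jointly trains a scalar design d and a small critic network by ascending the InfoNCE lower bound on the mutual information I(theta; y | d), using a toy reparameterizable linear simulator y = theta * d + noise. The critic architecture, batch size, learning rate, and design constraint are all illustrative assumptions.

import math
import torch
import torch.nn as nn

torch.manual_seed(0)
BATCH = 256

# Experimental design d and an amortized critic T(theta, y); both are
# updated by the same stochastic gradient step (names are illustrative).
design = nn.Parameter(torch.tensor([1.0]))
critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam([design, *critic.parameters()], lr=1e-2)

def simulate(theta, d):
    # Toy reparameterizable simulator: y = theta * d + Gaussian noise.
    # Real SBI simulators are typically non-differentiable black boxes.
    return theta * d + 0.1 * torch.randn_like(theta)

for step in range(2000):
    theta = torch.randn(BATCH, 1)            # draw parameters from the prior
    y = simulate(theta, design)              # run the (toy) experiment
    # Score every (theta_i, y_j) pair with the critic; diagonal entries are
    # joint samples, off-diagonal entries serve as contrastive negatives.
    pairs = torch.cat([theta.unsqueeze(1).expand(-1, BATCH, -1),
                       y.unsqueeze(0).expand(BATCH, -1, -1)], dim=-1)
    scores = critic(pairs).squeeze(-1)       # (BATCH, BATCH) score matrix
    # InfoNCE lower bound on I(theta; y | d).
    mi_bound = (scores.diag().mean()
                - torch.logsumexp(scores, dim=1).mean() + math.log(BATCH))
    opt.zero_grad()
    (-mi_bound).backward()                   # ascend the bound
    opt.step()
    with torch.no_grad():
        design.clamp_(-10.0, 10.0)           # keep the design in a bounded domain

Because this toy simulator is differentiable in d, gradients flow through simulate() to the design; for genuinely non-differentiable simulators the design gradients must be obtained another way, which is the gap the connection between ratio-based SBI and stochastic gradient variational inference is meant to address.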
