Conditional score-based diffusion models for Bayesian inference in infinite dimensions

Since their initial introduction, score-based diffusion models (SDMs) have been successfully applied to solve a variety of linear inverse problems in finite-dimensional vector spaces, owing to their ability to efficiently approximate the posterior distribution. However, using SDMs for inverse problems in infinite-dimensional function spaces has only been addressed recently, primarily through methods that learn the unconditional score. While this approach is advantageous for some inverse problems, it is mostly heuristic and involves numerous computationally costly forward-operator evaluations during posterior sampling. To address these limitations, we propose a theoretically grounded method for sampling from the posterior of infinite-dimensional Bayesian linear inverse problems based on amortized conditional SDMs. In particular, we prove that one of the most successful approaches for estimating the conditional score in finite dimensions, the conditional denoising estimator, can also be applied in infinite dimensions. A significant part of our analysis is dedicated to showing that extending infinite-dimensional SDMs to the conditional setting requires careful consideration, since the conditional score typically blows up for small times, in contrast to the unconditional score. We conclude by presenting stylized and large-scale numerical examples that validate our approach, offer additional insights, and demonstrate that our method enables large-scale, discretization-invariant Bayesian inference.
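The conditional denoising estimator referred to above can be illustrated, for a finite discretization of the function space, by a minimal sketch of its training objective. Everything below is an assumption for illustration: the noise schedule (variance equal to the diffusion time t), the grid size, and the `score_net(x_t, y, t)` interface are hypothetical, not the paper's actual architecture or parametrization.

```python
import numpy as np

def conditional_denoising_loss(score_net, x0, y, t, rng):
    """One-sample conditional denoising score-matching objective.

    x0 : clean sample (a function discretized on a grid)
    y  : conditioning observation, e.g. y = A x0 + noise
    t  : diffusion time in (0, 1]
    score_net(x_t, y, t) should approximate grad_x log p_t(x_t | y).
    Assumed perturbation kernel: x_t = x0 + sqrt(t) * eps, eps ~ N(0, I).
    """
    sigma_t = np.sqrt(t)                      # assumed schedule: Var = t
    eps = rng.standard_normal(x0.shape)
    x_t = x0 + sigma_t * eps                  # noised sample
    target = -eps / sigma_t                   # score of the Gaussian kernel
    return np.mean((score_net(x_t, y, t) - target) ** 2)

# The 1/sigma_t factor in the regression target grows without bound as
# t -> 0, which mirrors the small-time blow-up of the conditional score
# discussed in the abstract.
```

Minimizing this loss over samples (x0, y) amortizes posterior sampling: at inference time the learned conditional score is plugged into a reverse-time diffusion for a new observation y, with no further forward-operator evaluations.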
