Generalized Normalizing Flows via Markov Chains
[1] Ullrich Köthe, et al. Analyzing Inverse Problems with Invertible Neural Networks, 2018, ICLR.
[2] Nicola De Cao, et al. Block Neural Autoregressive Flow, 2019, UAI.
[3] G. Parisi. Brownian motion, 2005, Nature.
[4] Pauline Tan, et al. Solving Inverse Problems by Joint Posterior Maximization with Autoencoding Prior, 2021, ArXiv.
[5] Yee Whye Teh, et al. Bayesian Learning via Stochastic Gradient Langevin Dynamics, 2011, ICML.
[6] R. Tweedie, et al. Exponential convergence of Langevin distributions and their discrete approximations, 1996.
[8] Yang Song, et al. Generative Modeling by Estimating Gradients of the Data Distribution, 2019, NeurIPS.
[9] Anthony L. Caterini, et al. Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows, 2019, ICML.
[10] Yang Song, et al. On Maximum Likelihood Training of Score-Based Generative Models, 2021, ArXiv.
[11] B. Anderson. Reverse-time diffusion equation models, 1982.
[12] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[13] Guigang Zhang, et al. Deep Learning, 2016, Int. J. Semantic Comput.
[14] Gabriele Steidl, et al. Convolutional Proximal Neural Networks and Plug-and-Play Algorithms, 2020, Linear Algebra and its Applications.
[15] Peter Maass, et al. Conditional Invertible Neural Networks for Medical Imaging, 2021, J. Imaging.
[16] Antoine Houdard, et al. Wasserstein Patch Prior for Image Superresolution, 2021, ArXiv.
[17] M. Girolami, et al. Riemann manifold Langevin and Hamiltonian Monte Carlo methods, 2011, Journal of the Royal Statistical Society: Series B (Statistical Methodology).
[18] Hermann Gross, et al. Bayesian Approach to the Statistical Inverse Problem of Scatterometry: Comparison of Three Surrogate Models, 2015.
[19] Hermann Gross, et al. Bayesian approach to determine critical dimensions from scatterometric measurements, 2018, Metrologia.
[20] Iain Murray, et al. Masked Autoregressive Flow for Density Estimation, 2017, NIPS.
[21] U. Haussmann, et al. Time Reversal of Diffusions, 1986.
[22] Eldad Haber, et al. An introduction to deep generative modeling, 2021, GAMM-Mitteilungen.
[23] Surya Ganguli, et al. Deep Unsupervised Learning using Nonequilibrium Thermodynamics, 2015, ICML.
[24] Ole Winther, et al. SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows, 2020, NeurIPS.
[25] Gabriele Steidl, et al. Parseval Proximal Neural Networks, 2019, Journal of Fourier Analysis and Applications.
[26] Max Welling, et al. Learning Likelihoods with Conditional Normalizing Flows, 2019, ArXiv.
[27] Yongxin Chen, et al. Diffusion Normalizing Flow, 2021, NeurIPS.
[28] David D. L. Minh, et al. Nonequilibrium candidate Monte Carlo is an efficient tool for equilibrium simulation, 2011, Proceedings of the National Academy of Sciences.
[29] Aapo Hyvärinen, et al. Estimation of Non-Normalized Statistical Models by Score Matching, 2005, J. Mach. Learn. Res.
[30] Simon Osindero, et al. Conditional Generative Adversarial Nets, 2014, ArXiv.
[31] On the convergence of the Metropolis-Hastings Markov chains, 2013, arXiv:1302.0654.
[32] Paul Vicol, et al. Understanding and mitigating exploding inverses in invertible neural networks, 2020, AISTATS.
[33] J. Le Gall. Brownian Motion, Martingales, and Stochastic Calculus, 2016.
[34] Julien Rabin, et al. Wasserstein Generative Models for Patch-based Texture Synthesis, 2020, SSVM.
[35] Daniel Sheldon, et al. Normalizing Flows Across Dimensions, 2020, ArXiv.
[36] Honglak Lee, et al. Learning Structured Output Representation using Deep Conditional Generative Models, 2015, NIPS.
[37] Jan Kautz, et al. Score-based Generative Modeling in Latent Space, 2021, NeurIPS.
[38] Nicola De Cao, et al. Explorations in Homeomorphic Variational Auto-Encoding, 2018, ArXiv.
[39] Gabriele Steidl, et al. Invertible Neural Networks versus MCMC for Posterior Reconstruction in Grazing Incidence X-Ray Fluorescence, 2021, SSVM.
[40] Konik Kothari, et al. Trumpets: Injective Flows for Inference and Inverse Problems, 2021, UAI.
[41] Shakir Mohamed, et al. Variational Inference with Normalizing Flows, 2015, ICML.
[42] Samy Bengio, et al. Density estimation using Real NVP, 2016, ICLR.
[43] Joachim Weickert, et al. Dithering by Differences of Convex Functions, 2011, SIAM J. Imaging Sci.
[45] David Duvenaud, et al. Invertible Residual Networks, 2018, ICML.
[46] Max Welling, et al. Multiplicative Normalizing Flows for Variational Bayesian Neural Networks, 2017, ICML.
[47] L. Tierney. A note on Metropolis-Hastings kernels for general state spaces, 1998.
[48] J. Rosenthal, et al. General state space Markov chains and MCMC algorithms, 2004, math/0404033.
[49] Diederik P. Kingma, et al. An Introduction to Variational Autoencoders, 2019, Found. Trends Mach. Learn.
[50] Gabriele Steidl, et al. Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint, 2021, ArXiv.
[51] Ullrich Köthe, et al. Guided Image Generation with Conditional Invertible Neural Networks, 2019, ArXiv.
[52] S. Neumayer, et al. Stabilizing invertible neural networks using mixture models, 2020, Inverse Problems.
[53] David Duvenaud, et al. Residual Flows for Invertible Generative Modeling, 2019, NeurIPS.
[54] Thomas Müller, et al. Neural Importance Sampling, 2018, ACM Trans. Graph.
[55] Iain Murray, et al. Neural Spline Flows, 2019, NeurIPS.
[56] Jean-Christophe Pesquet, et al. Learning Maximally Monotone Operators for Image Recovery, 2020, SIAM J. Imaging Sci.
[57] Abhishek Kumar, et al. Score-Based Generative Modeling through Stochastic Differential Equations, 2020, ICLR.
[58] Jasper Snoek, et al. On the relationship between Normalising Flows and Variational- and Denoising Autoencoders, 2019, DGS@ICLR.
[59] Hao Wu, et al. Stochastic Normalizing Flows, 2020, NeurIPS.
[60] Arnaud Doucet, et al. Annealed Flow Transport Monte Carlo, 2021, ICML.
[61] Alexandre Lacoste, et al. Neural Autoregressive Flows, 2018, ICML.
[62] Max Welling, et al. Auto-Encoding Variational Bayes, 2013, ICLR.
[63] Prafulla Dhariwal, et al. Glow: Generative Flow with Invertible 1x1 Convolutions, 2018, NeurIPS.
[64] Yaoliang Yu, et al. Tails of Lipschitz Triangular Flows, 2020, ICML.
[67] Patrick L. Combettes, et al. Deep Neural Network Structures Solving Variational Inequalities, 2018, Set-Valued and Variational Analysis.