[1] Roland Vollgraf, et al. Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms, 2017, ArXiv.
[2] Alexander D'Amour, et al. Reducing Reparameterization Gradient Variance, 2017, NIPS.
[3] David Duvenaud, et al. Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference, 2017, NIPS.
[4] Daan Wierstra, et al. Stochastic Backpropagation and Approximate Inference in Deep Generative Models, 2014, ICML.
[5] David M. Blei, et al. Variational Inference: A Review for Statisticians, 2016, ArXiv.
[6] Yuan Yu, et al. TensorFlow: A system for large-scale machine learning, 2016, OSDI.
[7] Max Welling, et al. Auto-Encoding Variational Bayes, 2013, ICLR.
[8] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[9] Justin Domke, et al. Advances in Black-Box VI: Normalizing Flows, Importance Weighting, and Optimization, 2020, NeurIPS.
[10] Joshua B. Tenenbaum, et al. Human-level concept learning through probabilistic program induction, 2015, Science.
[11] Justin Domke, et al. Approximation Based Variance Reduction for Reparameterization Gradients, 2020, NeurIPS.
[12] Ruslan Salakhutdinov, et al. Importance Weighted Autoencoders, 2015, ICLR.
[13] Shakir Mohamed, et al. Variational Inference with Normalizing Flows, 2015, ICML.
[14] Michael I. Jordan, et al. An Introduction to Variational Methods for Graphical Models, 1999, Machine Learning.
[15] Michael Figurnov, et al. Monte Carlo Gradient Estimation in Machine Learning, 2019, Journal of Machine Learning Research.
[16] David M. Blei, et al. Overdispersed Black-Box Variational Inference, 2016, UAI.
[17] K. Jarrod Millman, et al. Array programming with NumPy, 2020, Nature.
[18] George Tucker, et al. Doubly Reparameterized Gradient Estimators for Monte Carlo Objectives, 2019, ICLR.
[19] Ronald J. Williams, et al. Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning, 1992, Machine Learning.
[20] Yee Whye Teh, et al. Tighter Variational Bounds are Not Necessarily Better, 2018, ICML.