Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models
[1] Nicola De Cao,et al. Block Neural Autoregressive Flow , 2019, UAI.
[2] Gordon Wetzstein,et al. Implicit Neural Representations with Periodic Activation Functions , 2020, NeurIPS.
[3] Iain Murray,et al. Cubic-Spline Flows , 2019, ICML Workshop.
[4] Mark Chen,et al. Language Models are Few-Shot Learners , 2020, NeurIPS.
[5] Sergey Ioffe,et al. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift , 2015, ICML.
[6] Graham Neubig,et al. Lagging Inference Networks and Posterior Collapse in Variational Autoencoders , 2019, ICLR.
[7] Rémi Munos,et al. Autoregressive Quantile Networks for Generative Modeling , 2018, ICML.
[8] Oriol Vinyals,et al. Neural Discrete Representation Learning , 2017, NIPS.
[9] Shiyu Chang,et al. TransGAN: Two Transformers Can Make One Strong GAN , 2021, ArXiv.
[10] Yoshua Bengio,et al. Small-GAN: Speeding Up GAN Training Using Core-sets , 2019, ICML.
[11] Yang Lu,et al. Cooperative Training of Descriptor and Generator Networks , 2016, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[12] Ilya Sutskever,et al. Generating Long Sequences with Sparse Transformers , 2019, ArXiv.
[13] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[14] Heiga Zen,et al. Parallel WaveNet: Fast High-Fidelity Speech Synthesis , 2017, ICML.
[15] Razvan Pascanu,et al. A RAD approach to deep mixture models , 2019, DGS@ICLR.
[16] David Duvenaud,et al. FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models , 2018, ICLR.
[17] Yee Whye Teh,et al. Bayesian Learning via Stochastic Gradient Langevin Dynamics , 2011, ICML.
[18] Aapo Hyvärinen,et al. Estimation of Non-Normalized Statistical Models by Score Matching , 2005, J. Mach. Learn. Res.
[19] Yang Song,et al. Sliced Score Matching: A Scalable Approach to Density and Score Estimation , 2019, UAI.
[20] Richard E. Turner,et al. Two problems with variational expectation maximisation for time-series models , 2011, Bayesian Time Series Models.
[21] Eric Nalisnick,et al. Normalizing Flows for Probabilistic Modeling and Inference , 2019, J. Mach. Learn. Res.
[22] Maxim Raginsky,et al. Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit , 2019, ArXiv.
[23] Jan Kautz,et al. NVAE: A Deep Hierarchical Variational Autoencoder , 2020, NeurIPS.
[24] Dustin Tran,et al. Variational Gaussian Process , 2015, ICLR.
[25] Prafulla Dhariwal,et al. Improved Denoising Diffusion Probabilistic Models , 2021, ICML.
[26] Stefano Ermon,et al. InfoVAE: Balancing Learning and Inference in Variational Autoencoders , 2019, AAAI.
[27] Alexander M. Rush,et al. Latent Normalizing Flows for Discrete Sequences , 2019, ICML.
[28] John P. Cunningham,et al. The continuous Bernoulli: fixing a pervasive error in variational autoencoders , 2019, NeurIPS.
[29] Yoshua Bengio,et al. Generative Adversarial Nets , 2014, NIPS.
[30] Cho-Jui Hsieh,et al. Improving the Speed and Quality of GAN by Adversarial Training , 2020, ArXiv.
[31] David Vázquez,et al. PixelVAE: A Latent Variable Model for Natural Images , 2016, ICLR.
[32] Stefano Ermon,et al. Learning Hierarchical Features from Deep Generative Models , 2017, ICML.
[33] Nal Kalchbrenner,et al. Generating High Fidelity Images with Subscale Pixel Networks and Multidimensional Upscaling , 2018, ICLR.
[34] C. Villani. Optimal Transport: Old and New , 2008.
[35] Yee Whye Teh,et al. Generative Models as Distributions of Functions , 2021, ArXiv.
[36] Xingjian Li,et al. OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport , 2020, ArXiv.
[37] Michael I. Jordan,et al. An Introduction to Variational Methods for Graphical Models , 1999, Machine Learning.
[38] Lukasz Kaiser,et al. Rethinking Attention with Performers , 2020, ArXiv.
[39] Tian Han,et al. Learning Latent Space Energy-Based Prior Model , 2020, NeurIPS.
[40] Jakub M. Tomczak,et al. Variational Inference with Orthogonal Normalizing Flows , 2017.
[41] Max Welling,et al. VAE with a VampPrior , 2017, AISTATS.
[42] Sepp Hochreiter,et al. GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium , 2017, NIPS.
[43] Sergio Gomez Colmenarejo,et al. Parallel Multiscale Autoregressive Density Estimation , 2017, ICML.
[44] Truyen Tran,et al. Catastrophic forgetting and mode collapse in GANs , 2020, 2020 International Joint Conference on Neural Networks (IJCNN).
[45] Jason Yosinski,et al. Metropolis-Hastings Generative Adversarial Networks , 2018, ICML.
[46] Stefano Ermon,et al. Improved Autoregressive Modeling with Distribution Smoothing , 2021, ICLR.
[47] Erik Nijkamp,et al. Learning Non-Convergent Non-Persistent Short-Run MCMC Toward Energy-Based Model , 2019, NeurIPS.
[48] Erik Nijkamp,et al. Learning Energy-based Model with Flow-based Backbone by Neural Transport MCMC , 2020, ArXiv.
[49] Arthur Gretton,et al. Demystifying MMD GANs , 2018, ICLR.
[50] David P. Wipf,et al. Diagnosing and Enhancing VAE Models , 2019, ICLR.
[51] Thomas Brox,et al. Generating Images with Perceptual Similarity Metrics based on Deep Networks , 2016, NIPS.
[52] Alexander M. Rush,et al. Semi-Amortized Variational Autoencoders , 2018, ICML.
[53] Pascal Vincent,et al. Generalized Denoising Auto-Encoders as Generative Models , 2013, NIPS.
[54] Alex Krizhevsky,et al. Learning Multiple Layers of Features from Tiny Images , 2009.
[55] Cho-Jui Hsieh,et al. Rob-GAN: Generator, Discriminator, and Adversarial Attacker , 2018, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[56] Simon Osindero,et al. Conditional Generative Adversarial Nets , 2014, ArXiv.
[57] Nando de Freitas,et al. On Autoencoders and Score Matching for Energy Based Models , 2011, ICML.
[58] Ryan Prenger,et al. Waveglow: A Flow-based Generative Network for Speech Synthesis , 2018, 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[59] Lantao Yu,et al. Training Deep Energy-Based Models with f-Divergence Minimization , 2020, ICML.
[60] Mohammad Norouzi,et al. Dream to Control: Learning Behaviors by Latent Imagination , 2019, ICLR.
[61] Yoshua Bengio,et al. Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling , 2020, NeurIPS.
[62] Charlie Nash,et al. Autoregressive Energy Machines , 2019, ICML.
[63] Tero Karras,et al. Training Generative Adversarial Networks with Limited Data , 2020, NeurIPS.
[64] Léon Bottou,et al. Towards Principled Methods for Training Generative Adversarial Networks , 2017, ICLR.
[65] Fu Jie Huang,et al. A Tutorial on Energy-Based Learning , 2006, Predicting Structured Data.
[66] Koray Kavukcuoglu,et al. Pixel Recurrent Neural Networks , 2016, ICML.
[67] Iain Murray,et al. Neural Spline Flows , 2019, NeurIPS.
[68] Yann LeCun,et al. Energy-based Generative Adversarial Network , 2016, ICLR.
[69] Ilya Sutskever,et al. On the Convergence Properties of Contrastive Divergence , 2010, AISTATS.
[70] David Duvenaud,et al. Residual Flows for Invertible Generative Modeling , 2019, NeurIPS.
[71] Yang Song,et al. MintNet: Building Invertible Neural Networks with Masked Convolutions , 2019, NeurIPS.
[72] Aapo Hyvärinen,et al. Noise-contrastive estimation: A new estimation principle for unnormalized statistical models , 2010, AISTATS.
[73] R. Tweedie,et al. Exponential convergence of Langevin distributions and their discrete approximations , 1996, Bernoulli.
[74] Wojciech Zaremba,et al. Improved Techniques for Training GANs , 2016, NIPS.
[75] Adam M. Oberman,et al. How to Train Your Neural ODE: the World of Jacobian and Kinetic Regularization , 2020, ICML.
[76] Max Welling,et al. Markov Chain Monte Carlo and Variational Inference: Bridging the Gap , 2014, ICML.
[77] Thomas Müller,et al. Neural Importance Sampling , 2018, ACM Trans. Graph.
[78] Ole Winther,et al. BIVA: A Very Deep Hierarchy of Latent Variables for Generative Modeling , 2019, NeurIPS.
[79] Zhenan Sun,et al. A Review on Generative Adversarial Networks: Algorithms, Theory, and Applications , 2020, IEEE Transactions on Knowledge and Data Engineering.
[80] Timo Aila,et al. A Style-Based Generator Architecture for Generative Adversarial Networks , 2018, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[81] Ruslan Salakhutdinov,et al. Importance Weighted Autoencoders , 2015, ICLR.
[82] Ole Winther,et al. Ladder Variational Autoencoders , 2016, NIPS.
[83] Stefano Ermon,et al. A-NICE-MC: Adversarial Training for MCMC , 2017, NIPS.
[84] Tian Han,et al. Alternating Back-Propagation for Generator Network , 2016, AAAI.
[85] Surya Ganguli,et al. Deep Unsupervised Learning using Nonequilibrium Thermodynamics , 2015, ICML.
[86] Zengyi Li,et al. Learning Energy-Based Models in High-Dimensional Spaces with Multi-scale Denoising Score Matching , 2019, arXiv:1910.07762.
[87] Guodong Zhang,et al. On Solving Minimax Optimization Locally: A Follow-the-Ridge Approach , 2019, ICLR.
[88] Max Welling,et al. Sylvester Normalizing Flows for Variational Inference , 2018, UAI.
[89] Ying Nian Wu,et al. Learning Energy-Based Models by Diffusion Recovery Likelihood , 2020, ICLR.
[90] Alexia Jolicoeur-Martineau,et al. The relativistic discriminator: a key element missing from standard GAN , 2018, ICLR.
[91] Jascha Sohl-Dickstein,et al. Invertible Convolutional Flow , 2019, NeurIPS.
[92] Aaron C. Courville,et al. Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models , 2020, ArXiv.
[93] Paul Babyn,et al. Generative Adversarial Network in Medical Imaging: A Review , 2018, Medical Image Anal.
[94] Raymond Y. K. Lau,et al. Least Squares Generative Adversarial Networks , 2016, 2017 IEEE International Conference on Computer Vision (ICCV).
[95] Guoping Qiu,et al. Spectral regularization for combating mode collapse in GANs , 2020, Image Vis. Comput.
[96] Jonathan T. Barron,et al. Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains , 2020, NeurIPS.
[97] Yee Whye Teh,et al. A Fast Learning Algorithm for Deep Belief Nets , 2006, Neural Computation.
[98] Yang Lu,et al. A Theory of Generative ConvNet , 2016, ICML.
[99] Diederik P. Kingma,et al. How to Train Your Energy-Based Models , 2021, ArXiv.
[100] L. Duan. Transport Monte Carlo , 2019, arXiv:1907.10448.
[101] Sergey Levine,et al. Stochastic Adversarial Video Prediction , 2018, ArXiv.
[102] Shakir Mohamed,et al. Variational Inference with Normalizing Flows , 2015, ICML.
[103] Samy Bengio,et al. Density estimation using Real NVP , 2016, ICLR.
[104] Han Fang,et al. Linformer: Self-Attention with Linear Complexity , 2020, ArXiv.
[105] C. Villani. Topics in Optimal Transportation , 2003.
[106] Alex Graves,et al. DRAW: A Recurrent Neural Network For Image Generation , 2015, ICML.
[107] Frederick R. Forst,et al. On robust estimation of the location parameter , 1980.
[108] Yuichi Yoshida,et al. Spectral Normalization for Generative Adversarial Networks , 2018, ICLR.
[109] Yaoliang Yu,et al. Sum-of-Squares Polynomial Flow , 2019, ICML.
[110] Oliver Wang,et al. MSG-GAN: Multi-Scale Gradients for Generative Adversarial Networks , 2019, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[111] Arash Vahdat,et al. VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models , 2020, ArXiv.
[112] Jeff Donahue,et al. Large Scale GAN Training for High Fidelity Natural Image Synthesis , 2018, ICLR.
[113] Sameer Singh,et al. Image Augmentations for GAN Training , 2020, ArXiv.
[114] Joshua V. Dillon,et al. NeuTra-lizing Bad Geometry in Hamiltonian Monte Carlo Using Neural Transport , 2019, arXiv:1903.03704.
[115] Michael Burke,et al. DepthwiseGANs: Fast Training Generative Adversarial Networks for Realistic Image Synthesis , 2019, 2019 Southern African Universities Power Engineering Conference/Robotics and Mechatronics/Pattern Recognition Association of South Africa (SAUPEC/RobMech/PRASA).
[116] Pieter Abbeel,et al. Variational Lossy Autoencoder , 2016, ICLR.
[117] Yang Song,et al. Generative Modeling by Estimating Gradients of the Data Distribution , 2019, NeurIPS.
[118] Ole Winther,et al. SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows , 2020, NeurIPS.
[119] M. L. Chambers. The Mathematical Theory of Optimal Processes , 1965.
[120] David Duvenaud,et al. Neural Ordinary Differential Equations , 2018, NeurIPS.
[121] Bernhard Schölkopf,et al. Deep Energy Estimator Networks , 2018, ArXiv.
[122] Aleksander Madry,et al. Towards Deep Learning Models Resistant to Adversarial Attacks , 2017, ICLR.
[123] Rewon Child. Very Deep VAEs Generalize Autoregressive Models and Can Outperform Them on Images , 2021, ICLR.
[124] David Duvenaud,et al. Invertible Residual Networks , 2018, ICML.
[125] Xi Chen,et al. PixelCNN++: Improving the PixelCNN with Discretized Logistic Mixture Likelihood and Other Modifications , 2017, ICLR.
[126] Geoffrey E. Hinton,et al. A New Learning Algorithm for Mean Field Boltzmann Machines , 2002, ICANN.
[127] Song Han,et al. Differentiable Augmentation for Data-Efficient GAN Training , 2020, NeurIPS.
[128] Abhishek Kumar,et al. Score-Based Generative Modeling through Stochastic Differential Equations , 2020, ICLR.
[129] Yoshua Bengio,et al. MelGAN: Generative Adversarial Networks for Conditional Waveform Synthesis , 2019, NeurIPS.
[130] Mohammad Norouzi,et al. No MCMC for me: Amortized sampling for fast and stable training of energy-based models , 2021, ICLR.
[131] Ngai-Man Cheung,et al. On Data Augmentation for GAN Training , 2020, IEEE Transactions on Image Processing.
[132] Han Zhang,et al. Self-Attention Generative Adversarial Networks , 2018, ICML.
[133] George Em Karniadakis,et al. Potential Flow Generator With L2 Optimal Transport Regularity for Generative Models , 2019, IEEE Transactions on Neural Networks and Learning Systems.
[134] Bernhard Schölkopf,et al. From Variational to Deterministic Autoencoders , 2019, ICLR.
[135] Petros Dellaportas,et al. Gradient-based Adaptive Markov Chain Monte Carlo , 2019, NeurIPS.
[136] Andriy Mnih,et al. The Lipschitz Constant of Self-Attention , 2020, ICML.
[137] Max Welling,et al. Emerging Convolutions for Generative Normalizing Flows , 2019, ICML.
[138] Rob Fergus,et al. Deep Generative Image Models using a Laplacian Pyramid of Adversarial Networks , 2015, NIPS.
[139] Dmitry Vetrov,et al. The Implicit Metropolis-Hastings Algorithm , 2019, NeurIPS.
[140] Jürgen Schmidhuber,et al. Long Short-Term Memory , 1997, Neural Computation.
[141] Yee Whye Teh,et al. Augmented Neural ODEs , 2019, NeurIPS.
[142] Jiaming Song,et al. Denoising Diffusion Implicit Models , 2021, ICLR.
[143] Richard Zemel,et al. Learning the Stein Discrepancy for Training and Evaluating Energy-Based Models without Sampling , 2020, ICML.
[144] Pieter Abbeel,et al. Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design , 2019, ICML.
[145] Tian Han,et al. On the Anatomy of MCMC-based Maximum Likelihood Learning of Energy-Based Models , 2019, AAAI.
[146] Lucas Theis,et al. Amortised MAP Inference for Image Super-resolution , 2016, ICLR.
[147] Kumar Krishna Agrawal,et al. Discrete Flows: Invertible Generative Models of Discrete Data , 2019, DGS@ICLR.
[148] Jakub M. Tomczak,et al. The Convolution Exponential and Generalized Sylvester Flows , 2020, NeurIPS.
[149] Jian Sun,et al. Deep Residual Learning for Image Recognition , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[150] P. J. Huber. Robust Estimation of a Location Parameter , 1964, The Annals of Mathematical Statistics.
[151] Igor Mordatch,et al. Implicit Generation and Generalization with Energy Based Models , 2018.
[152] Xiaohua Zhai,et al. Self-Supervised GANs via Auxiliary Rotation Loss , 2018, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[153] Sergey Levine,et al. Stochastic Variational Video Prediction , 2017, ICLR.
[154] Mark Chen,et al. Distribution Augmentation for Generative Modeling , 2020, ICML.
[155] Ryan P. Adams,et al. SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models , 2020, ICLR.
[156] Geoffrey E. Hinton. Training Products of Experts by Minimizing Contrastive Divergence , 2002, Neural Computation.
[157] Alexandre Lacoste,et al. Neural Autoregressive Flows , 2018, ICML.
[158] Nikolaos Pappas,et al. Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention , 2020, ICML.
[159] David Lopez-Paz,et al. Optimizing the Latent Space of Generative Networks , 2017, ICML.
[160] Yoshua Bengio,et al. NICE: Non-linear Independent Components Estimation , 2014, ICLR.
[161] Jaakko Lehtinen,et al. Progressive Growing of GANs for Improved Quality, Stability, and Variation , 2017, ICLR.
[162] Soumith Chintala,et al. Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks , 2015, ICLR.
[163] Jun Zhu,et al. A Spectral Approach to Gradient Estimation for Implicit Distributions , 2018, ICML.
[164] Yoshua Bengio,et al. Deep Directed Generative Models with Energy-Based Probability Estimation , 2016, ArXiv.
[165] Yoshua Bengio,et al. SampleRNN: An Unconditional End-to-End Neural Audio Generation Model , 2016, ICLR.
[166] Aaron C. Courville,et al. Improved Training of Wasserstein GANs , 2017, NIPS.
[167] Jun Zhu,et al. VFlow: More Expressive Generative Flows with Variational Data Augmentation , 2020, ICML.
[168] Colin Raffel,et al. Towards GAN Benchmarks Which Require Generalization , 2020, ICLR.
[169] Jaakko Lehtinen,et al. Analyzing and Improving the Image Quality of StyleGAN , 2020, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[170] Mohammad Havaei,et al. Learnable Explicit Density for Continuous Latent Space and Variational Inference , 2017, ArXiv.
[171] Iain Murray,et al. On Contrastive Learning for Likelihood-free Inference , 2020, ICML.
[172] Erik Nijkamp,et al. Learning Multi-layer Latent Variable Model via Variational Optimization of Short Run MCMC for Approximate Inference , 2019, ECCV.
[173] Chris G. Willcocks,et al. Gradient Origin Networks , 2020, ArXiv.
[174] Ivan Kobyzev,et al. Normalizing Flows: An Introduction and Review of Current Methods , 2020, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[175] Ilya Sutskever,et al. Zero-Shot Text-to-Image Generation , 2021, ICML.
[176] Eduard H. Hovy,et al. MAE: Mutual Posterior-Divergence Regularization for Variational AutoEncoders , 2019, ICLR.
[177] Pascal Vincent,et al. A Connection Between Score Matching and Denoising Autoencoders , 2011, Neural Computation.
[178] Navdeep Jaitly,et al. Adversarial Autoencoders , 2015, ArXiv.
[179] Dustin Tran,et al. Image Transformer , 2018, ICML.
[180] François Chollet,et al. Xception: Deep Learning with Depthwise Separable Convolutions , 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[181] Yoshua Bengio,et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling , 2014, ArXiv.
[182] Achraf Oussidi,et al. Deep generative models: Survey , 2018, 2018 International Conference on Intelligent Systems and Computer Vision (ISCV).
[183] Prafulla Dhariwal,et al. Glow: Generative Flow with Invertible 1x1 Convolutions , 2018, NeurIPS.
[184] Alex Graves,et al. Conditional Image Generation with PixelCNN Decoders , 2016, NIPS.
[185] Ali Razavi,et al. Generating Diverse High-Fidelity Images with VQ-VAE-2 , 2019, NeurIPS.
[186] Mario Lucic,et al. Are GANs Created Equal? A Large-Scale Study , 2017, NeurIPS.
[187] Stefano Ermon,et al. Towards Deeper Understanding of Variational Autoencoding Models , 2017, ArXiv.
[188] Miguel Á. Carreira-Perpiñán,et al. On Contrastive Divergence Learning , 2005, AISTATS.
[189] Lantao Yu,et al. Autoregressive Score Matching , 2020, NeurIPS.
[190] Tie-Yan Liu,et al. Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling by Exploring Energy of the Discriminator , 2020, ArXiv.
[191] Mohammad Norouzi,et al. Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One , 2019, ICLR.
[192] Matthias Bethge,et al. A note on the evaluation of generative models , 2015, ICLR.
[193] Samy Bengio,et al. Generating Sentences from a Continuous Space , 2015, CoNLL.
[194] Sebastian Nowozin,et al. f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization , 2016, NIPS.
[195] Nina Narodytska,et al. RelGAN: Relational Generative Adversarial Networks for Text Generation , 2019, ICLR.
[196] David J. Fleet,et al. Exemplar VAE: Linking Generative Models, Nearest Neighbor Retrieval, and Data Augmentation , 2020, NeurIPS.
[197] Jun Zhu,et al. Implicit Normalizing Flows , 2021, ICLR.
[198] Yizhe Zhu,et al. Towards Faster and Stabilized GAN Training for High-fidelity Few-shot Image Synthesis , 2021, ICLR.
[199] Ferenc Huszar,et al. How (not) to Train your Generative Model: Scheduled Sampling, Likelihood, Adversary? , 2015, ArXiv.
[200] Dimitris N. Metaxas,et al. StackGAN: Text to Photo-Realistic Image Synthesis with Stacked Generative Adversarial Networks , 2016, 2017 IEEE International Conference on Computer Vision (ICCV).
[201] Geoffrey E. Hinton,et al. Optimal Perceptual Inference , 1983, CVPR.
[202] Bernhard Pfahringer,et al. Regularisation of neural networks by enforcing Lipschitz continuity , 2018, Machine Learning.
[203] Max Welling,et al. Improved Variational Inference with Inverse Autoregressive Flow , 2016, NIPS.
[204] Jonathon Shlens,et al. Conditional Image Synthesis with Auxiliary Classifier GANs , 2016, ICML.
[205] John E. Hopcroft,et al. Stacked Generative Adversarial Networks , 2016, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[206] Patrick Esser,et al. Taming Transformers for High-Resolution Image Synthesis , 2021, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[207] Iain Murray,et al. Masked Autoregressive Flow for Density Estimation , 2017, NIPS.
[208] Heiga Zen,et al. WaveNet: A Generative Model for Raw Audio , 2016, SSW.
[209] Yoshua Bengio,et al. Maximum Entropy Generators for Energy-Based Models , 2019, ArXiv.
[210] Ngai-Man Cheung,et al. Self-supervised GAN: Analysis and Improvement with Multi-class Minimax Game , 2019, NeurIPS.
[211] Zengyi Li,et al. A Neural Network MCMC Sampler That Maximizes Proposal Entropy , 2020, Entropy.
[212] Yoshua Bengio,et al. Estimating or Propagating Gradients Through Stochastic Neurons for Conditional Computation , 2013, ArXiv.
[213] E Weinan,et al. Monge-Ampère Flow for Generative Modeling , 2018, ArXiv.
[214] Maxim Raginsky,et al. Theoretical guarantees for sampling and inference in generative models with latent diffusions , 2019, COLT.
[215] Arthur Gretton,et al. Generalized Energy Based Models , 2020, ICLR.
[216] Andrew M. Dai,et al. Flow Contrastive Estimation of Energy-Based Models , 2020, 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[217] Eduard H. Hovy,et al. MaCow: Masked Convolutional Generative Flow , 2019, NeurIPS.
[218] Anthony L. Caterini,et al. Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows , 2019, ICML.