FIS-GAN: GAN with Flow-based Importance Sampling

Generative Adversarial Network (GAN) training, in most cases, draws latent codes from uniform or Gaussian distributions, which likely spends most of the computation on examples that the generator already handles well and can produce easily. Theoretically, importance sampling speeds up stochastic gradient algorithms for supervised learning by prioritizing training examples. In this paper, we explore adapting importance sampling to adversarial learning. We replace uniform and Gaussian sampling in the latent space with importance sampling, and combine normalizing flows with importance sampling to approximate the latent-space posterior distribution by density estimation. Empirically, results on MNIST and Fashion-MNIST demonstrate that our method significantly accelerates the convergence of the generative process while retaining visual fidelity in the generated samples.
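
The abstract does not give implementation details, so the following PyTorch sketch is only a rough illustration of the general idea it describes: sample latent codes from a learned flow q(z) instead of the fixed prior p(z), and reweight the generator loss with importance weights w = p(z)/q(z). All names (AffineFlow, generator_step), layer sizes, and hyperparameters are assumptions rather than the paper's architecture, and the step that fits the flow to the latent-space posterior by density estimation is omitted.

```python
import torch
import torch.nn as nn

latent_dim, data_dim, batch_size = 64, 784, 128

class AffineFlow(nn.Module):
    """A trivial one-layer element-wise affine flow: z = mu + exp(log_sigma) * eps.
    Stands in for a richer normalizing flow over the latent space."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_sigma = nn.Parameter(torch.zeros(dim))

    def sample_with_log_prob(self, n):
        eps = torch.randn(n, self.mu.numel())
        z = self.mu + torch.exp(self.log_sigma) * eps
        # Change of variables: log q(z) = log N(eps; 0, I) - sum(log_sigma).
        base = torch.distributions.Normal(0.0, 1.0).log_prob(eps).sum(dim=1)
        log_q = base - self.log_sigma.sum()
        return z, log_q

# Hypothetical generator/discriminator; the paper's networks are not specified here.
generator = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                          nn.Linear(256, data_dim), nn.Tanh())
discriminator = nn.Sequential(nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
                              nn.Linear(256, 1))
flow = AffineFlow(latent_dim)
prior = torch.distributions.Normal(torch.zeros(latent_dim), torch.ones(latent_dim))
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)

def generator_step():
    # Latent codes come from the flow (the proposal), not from the prior.
    z, log_q = flow.sample_with_log_prob(batch_size)
    log_p = prior.log_prob(z).sum(dim=1)
    # Self-normalized importance weights p(z)/q(z); detached so they only rescale gradients.
    w = torch.softmax(log_p - log_q, dim=0).detach()
    fake = generator(z)
    # Non-saturating generator loss, reweighted per latent sample.
    loss_g = -(w * torch.nn.functional.logsigmoid(discriminator(fake)).squeeze(1)).sum()
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
    return loss_g.item()
```

In this sketch the flow acts purely as a sampling proposal: the importance weights keep the generator objective an (approximately) unbiased estimate of the original expectation over the prior, while the flow is free to concentrate latent samples where they are most informative.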
