FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

A promising class of generative models maps points from a simple distribution to a complex distribution through an invertible neural network. Likelihood-based training of these models requires restricting their architectures to allow cheap computation of Jacobian determinants. Alternatively, the Jacobian trace can be used if the transformation is specified by an ordinary differential equation. In this paper, we use Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density. The result is a continuous-time invertible generative model with unbiased density estimation and one-pass sampling, while allowing unrestricted neural network architectures. We demonstrate our approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
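The key trick named above, Hutchinson's trace estimator, replaces an exact trace with an expectation over matrix-vector products: tr(A) = E[εᵀAε] for any noise ε with zero mean and identity covariance. A minimal NumPy sketch (the function name `hutchinson_trace` and the Rademacher probe choice are illustrative, not from the paper):

```python
import numpy as np

def hutchinson_trace(matvec, dim, num_samples=5000, seed=None):
    """Unbiased estimate of tr(A), accessing A only via products v -> A @ v.

    Uses Rademacher probe vectors (entries +/-1), a common low-variance
    choice; any zero-mean, identity-covariance distribution also works.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(num_samples):
        eps = rng.choice([-1.0, 1.0], size=dim)  # Rademacher probe
        total += eps @ matvec(eps)               # eps^T A eps
    return total / num_samples

# Sanity check against the exact trace of a dense random matrix.
A = np.random.default_rng(0).normal(size=(50, 50))
estimate = hutchinson_trace(lambda v: A @ v, dim=50, seed=1)
exact = np.trace(A)
```

In the continuous-time setting, `matvec` would be a vector-Jacobian product of the ODE dynamics, obtainable by reverse-mode autodiff at roughly the cost of one function evaluation, which is what makes the log-density estimate scale to unrestricted architectures.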
