A theory of continuous generative flow networks

Generative flow networks (GFlowNets) are amortized variational inference algorithms trained to sample from unnormalized target distributions over compositional objects. A key limitation of GFlowNets to date has been their restriction to discrete spaces. We present a theory of generalized GFlowNets that encompasses both existing discrete GFlowNets and those with continuous or hybrid state spaces, and we perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of its assumptions. Second, we demonstrate empirically how observations about discrete GFlowNets transfer to the continuous case, and we show strong results against non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for applying GFlowNets to probabilistic inference and various modeling settings.
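As an illustrative sketch (not stated in the abstract itself), the discrete training criterion that such a theory must generalize to continuous spaces can be written as the trajectory balance condition. Here $\tau = (s_0 \to \dots \to s_n = x)$ is a trajectory ending at a terminal object $x$, $Z$ is a learned partition-function estimate, $P_F$ and $P_B$ are the forward and backward policies, and $R(x)$ is the unnormalized target (reward):

```latex
% Trajectory balance: when this holds for every complete trajectory,
% the forward policy samples terminal objects x with probability
% proportional to R(x).
Z \prod_{t=0}^{n-1} P_F(s_{t+1} \mid s_t)
  \;=\;
R(x) \prod_{t=0}^{n-1} P_B(s_t \mid s_{t+1})
```

In the continuous or hybrid setting, the products of transition probabilities are replaced by densities with respect to appropriate reference measures, which is where the measure-theoretic assumptions of the theory become essential.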
