Latent Variable Modelling with Hyperbolic Normalizing Flows

The choice of approximate posterior distribution plays a central role in stochastic variational inference (SVI). One effective solution is the use of normalizing flows to construct flexible posterior distributions. However, a key limitation of existing normalizing flows is that they are restricted to Euclidean space and are ill-equipped to model data with an underlying hierarchical structure. To address this fundamental limitation, we present the first extension of normalizing flows to hyperbolic spaces. We first elevate normalizing flows to hyperbolic spaces using coupling transforms defined on the tangent bundle, termed Tangent Coupling ($\mathcal{TC}$). We further introduce Wrapped Hyperboloid Coupling ($\mathcal{W}\mathbb{H}C$), a fully invertible and learnable transformation that explicitly utilizes the geometric structure of hyperbolic spaces, allowing for expressive posteriors that remain efficient to sample from. We demonstrate the efficacy of our novel normalizing flows against hyperbolic VAEs and Euclidean normalizing flows. Our approach achieves improved performance on density estimation, as well as on reconstruction of real-world graph data, which exhibit a hierarchical structure. Finally, we show that our approach can be used to power a generative model over hierarchical data using hyperbolic latent variables.
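To make the tangent-bundle construction concrete, the following is a minimal sketch (not the paper's released code) of a single Tangent Coupling layer on the Lorentz (hyperboloid) model with curvature $-1$: a point is pulled back to the tangent space at the origin with the logarithm map, split and transformed by a standard affine coupling, and pushed back onto the manifold with the exponential map. The class name `TangentCoupling`, the two-layer conditioner, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

def lorentz_inner(x, y):
    # Minkowski inner product <x, y>_L = -x_0 y_0 + sum_i x_i y_i
    return -x[..., :1] * y[..., :1] + (x[..., 1:] * y[..., 1:]).sum(-1, keepdim=True)

def exp_map_origin(v):
    # Exponential map at the hyperboloid origin o = (1, 0, ..., 0); v must have v_0 = 0.
    o = torch.zeros_like(v)
    o[..., 0] = 1.0
    vn = torch.norm(v[..., 1:], dim=-1, keepdim=True).clamp_min(1e-7)
    return torch.cosh(vn) * o + torch.sinh(vn) * v / vn

def log_map_origin(x):
    # Logarithm map at the origin; returns a tangent vector with zero time-like coordinate.
    o = torch.zeros_like(x)
    o[..., 0] = 1.0
    alpha = (-lorentz_inner(o, x)).clamp_min(1.0 + 1e-7)  # cosh of the distance to o
    u = x - alpha * o                                     # component orthogonal to o (u_0 = 0)
    un = torch.norm(u[..., 1:], dim=-1, keepdim=True).clamp_min(1e-7)
    return torch.acosh(alpha) * u / un

class TangentCoupling(nn.Module):
    """RealNVP-style affine coupling applied in the tangent space at the origin."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(nn.Linear(self.d, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.d)))

    def forward(self, x):
        v = log_map_origin(x)[..., 1:]             # drop the (zero) time-like coordinate
        v1, v2 = v[..., :self.d], v[..., self.d:]
        s, t = self.net(v1).chunk(2, dim=-1)
        v2 = v2 * torch.exp(s) + t                 # transform one partition given the other
        zeros = torch.zeros_like(v[..., :1])
        return exp_map_origin(torch.cat([zeros, v1, v2], dim=-1))

# Usage: points produced by exp_map_origin satisfy <x, x>_L = -1, and so does the output.
x = exp_map_origin(torch.cat([torch.zeros(8, 1), torch.randn(8, 6)], dim=-1))
y = TangentCoupling(dim=6)(x)
```

A full flow would additionally require the inverse transform and the log-determinant terms contributed by the coupling and by the exponential/logarithm maps, which are derived in the paper.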
