HGAN: Hyperbolic Generative Adversarial Network

Recently, hyperbolic spaces have gained popularity in non-Euclidean deep learning because of their ability to represent hierarchical data. We propose to exploit the hierarchical structure present in images by using hyperbolic neural networks within a GAN architecture. In this study, we test different configurations of fully connected hyperbolic layers in the GAN, CGAN, and WGAN, yielding what we call the HGAN, HCGAN, and HWGAN, respectively. Results are measured with the Inception Score (IS) and the Fréchet Inception Distance (FID) on the MNIST dataset. Depending on the configuration and the space curvature, each proposed hyperbolic version achieves better results than its Euclidean counterpart.
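As a minimal sketch of what a "fully connected hyperbolic layer" involves, the following NumPy code implements the Möbius operations on the Poincaré ball of curvature -c, in the formulation popularized by the Hyperbolic Neural Networks line of work: a Möbius matrix-vector product followed by a Möbius bias addition. The function names and the choice of NumPy are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Möbius addition x ⊕_c y on the Poincaré ball of curvature -c."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + (c ** 2) * x2 * y2
    return num / den

def mobius_matvec(M, x, c=1.0):
    """Möbius matrix-vector product M ⊗_c x (maps ball points to ball points)."""
    sqrt_c = np.sqrt(c)
    x_norm = np.linalg.norm(x)
    Mx = M @ x
    Mx_norm = np.linalg.norm(Mx)
    if x_norm == 0 or Mx_norm == 0:
        return np.zeros_like(Mx)
    # Rescale so the result stays inside the ball of radius 1/sqrt(c).
    scale = np.tanh((Mx_norm / x_norm) * np.arctanh(sqrt_c * x_norm))
    return scale * Mx / (sqrt_c * Mx_norm)

def hyp_linear(M, b, x, c=1.0):
    """Hyperbolic fully connected layer: (M ⊗_c x) ⊕_c b."""
    return mobius_add(mobius_matvec(M, x, c), b, c)
```

Stacking such layers (with Möbius-translated nonlinearities) in the generator or discriminator, instead of ordinary affine layers, is the kind of configuration the paper evaluates; the curvature c then becomes an additional hyperparameter.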
