Generative Adversarial Graph Representation Learning in Hyperbolic Space

Representation learning provides compact feature representations, and many such methods have been successfully applied to graph-structured data. However, most existing graph representation learning methods operate in high-dimensional Euclidean space and do not account for the latent hierarchical structure of the data. Recent studies have shown that graph-structured data are well suited to embedding in hyperbolic space, which naturally models hierarchical structures and outperforms Euclidean embeddings on such data. We therefore take the hierarchical structure of graphs fully into account and learn node representations in hyperbolic space using the principle of generative adversarial learning: two models simultaneously capture hierarchy and similarity and compete with each other, so that the performance of each is alternately boosted. We evaluate the method on node classification, link prediction, and visualization over multiple public datasets, and the results show that it performs well across all of these tasks.
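To make the claim about hyperbolic space concrete, the sketch below (not the authors' implementation; `poincare_distance` and its arguments are illustrative names) computes the geodesic distance in the Poincaré ball model, the hyperbolic model used by Poincaré embeddings [10]. Distance from the origin grows without bound as a point approaches the unit-ball boundary, which is what lets trees and other hierarchies embed with low distortion:

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    # Geodesic distance between two points inside the unit ball,
    # d(u, v) = arccosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2))).
    sq_u = np.dot(u, u)
    sq_v = np.dot(v, v)
    sq_diff = np.dot(u - v, u - v)
    # eps guards against division by zero for points on the boundary.
    x = 1.0 + 2.0 * sq_diff / ((1.0 - sq_u) * (1.0 - sq_v) + eps)
    return np.arccosh(x)

origin = np.zeros(2)
near = np.array([0.5, 0.0])   # halfway to the boundary
far = np.array([0.9, 0.0])    # close to the boundary
# Points near the boundary are far from the origin in hyperbolic distance,
# so children of a hierarchy can be pushed outward while staying separable.
print(poincare_distance(origin, near), poincare_distance(origin, far))
```

In such embeddings, the root of a hierarchy typically sits near the origin and leaves spread toward the boundary, where there is exponentially more "room" than in Euclidean space.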
