Node2LV: Squared Lorentzian Representations for Node Proximity

Network embedding has recently attracted extensive research interest. Most existing network embedding models operate in Euclidean space. However, Euclidean embeddings cannot effectively capture complex patterns, especially the latent hierarchical structures underlying real-world graphs. Consequently, hyperbolic representation models have been developed to preserve this hierarchical information. Nevertheless, existing hyperbolic models capture only the first-order proximity between nodes. To address this, we propose a new embedding model, named Node2LV, that learns hyperbolic representations of nodes using squared Lorentzian distances. This yields three advantages. First, our model effectively captures hierarchical structures that arise from the network topology. Second, unlike conventional hyperbolic embedding methods that rely on computationally expensive Riemannian gradients, it can be optimized more efficiently. Lastly, different from existing hyperbolic embedding models, Node2LV captures higher-order proximities. Specifically, we represent each node with two hyperbolic embeddings and make the embeddings of related nodes close to each other. To preserve higher-order node proximity, we use a random-walk strategy to generate local neighborhood contexts. We conduct extensive experiments on four different types of real-world networks, and the empirical results demonstrate that Node2LV significantly outperforms a variety of graph embedding baselines.
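The squared Lorentzian distance at the core of Node2LV can be computed with ordinary tensor operations, which is why plain Euclidean gradients on the spatial coordinates suffice. The sketch below is a minimal illustration, not the authors' implementation: it assumes the hyperboloid parameterization of Law et al. (Lorentzian Distance Learning for Hyperbolic Representations), a curvature hyperparameter BETA, and a two-embedding (source/context) setup whose names and shapes are illustrative.

```python
import torch

BETA = 1.0  # assumed curvature hyperparameter; the hyperboloid satisfies <x, x>_L = -BETA

def lift_to_hyperboloid(z: torch.Tensor) -> torch.Tensor:
    """Map spatial coordinates z of shape (batch, d) to hyperboloid points (batch, d+1).

    The time-like coordinate z0 = sqrt(||z||^2 + BETA) guarantees <x, x>_L = -BETA,
    so the embedding can be stored and optimized as an ordinary Euclidean tensor.
    """
    z0 = torch.sqrt(z.pow(2).sum(dim=-1, keepdim=True) + BETA)
    return torch.cat([z0, z], dim=-1)

def lorentzian_inner(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Lorentzian inner product <x, y>_L = -x0*y0 + sum_i x_i*y_i."""
    return -x[..., 0] * y[..., 0] + (x[..., 1:] * y[..., 1:]).sum(dim=-1)

def squared_lorentzian_distance(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """d_L^2(x, y) = ||x - y||_L^2 = -2*BETA - 2*<x, y>_L (non-negative on the hyperboloid)."""
    return -2.0 * BETA - 2.0 * lorentzian_inner(x, y)

# Illustrative usage: two embeddings per node ("source" and "context"), brought close
# for node pairs that co-occur in random-walk windows. Sizes below are hypothetical.
num_nodes, dim = 100, 16
src = torch.nn.Parameter(0.01 * torch.randn(num_nodes, dim))
ctx = torch.nn.Parameter(0.01 * torch.randn(num_nodes, dim))

u = lift_to_hyperboloid(src[0:1])   # source embedding of node 0
v = lift_to_hyperboloid(ctx[1:2])   # context embedding of node 1
d2 = squared_lorentzian_distance(u, v)
d2.sum().backward()  # standard autograd gradients on src/ctx; no Riemannian retraction needed
```

Because the lifting step keeps every point exactly on the hyperboloid by construction, a loss built from these distances can be minimized with any off-the-shelf optimizer over the spatial coordinates, which is the efficiency advantage over Riemannian gradient methods noted above.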
