Improving Gaussian Embedding for Extracting Local Semantic Connectivity in Networks

Gaussian embedding in unsupervised graph representation learning embeds vertices into Gaussian distributions, on which downstream tasks such as link prediction and node classification can be computed efficiently. Existing Gaussian embedding methods rely only on the immediate neighbors of a vertex and disregard the indirect connectivity information carried by remote vertices, which is nevertheless part of the local semantic connectivity of a network. In this paper, we propose GLP2Gauss, an unsupervised graph representation model that improves Gaussian embedding by coarsening the graph to capture indirect local connectivity. First, we decompose the original graph into paths and subgraphs. Second, we present a path-based embedding strategy, combined with the Gaussian embedding method, that better preserves the local relevance of embedded nodes. Experiments on benchmark datasets show that GLP2Gauss achieves competitive performance on node classification and link prediction compared with off-the-shelf network representation models.
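To illustrate how downstream tasks can be computed directly on Gaussian embeddings, the sketch below scores a candidate link by the squared 2-Wasserstein distance between two node embeddings with diagonal covariances, a common choice in Gaussian embedding work. The embedding vectors here are made-up placeholders, not GLP2Gauss outputs, and the scoring rule is one standard option rather than the paper's exact objective:

```python
import numpy as np

def w2_distance_sq(mu1, var1, mu2, var2):
    """Squared 2-Wasserstein distance between two Gaussians with
    diagonal covariances: ||mu1 - mu2||^2 + ||sqrt(var1) - sqrt(var2)||^2.

    mu*: mean vectors; var*: per-dimension variances (diagonal of Sigma).
    """
    mean_term = np.sum((mu1 - mu2) ** 2)
    cov_term = np.sum((np.sqrt(var1) - np.sqrt(var2)) ** 2)
    return mean_term + cov_term

# Hypothetical 4-dimensional embeddings for two nodes (illustrative values).
mu_a, var_a = np.array([0.1, 0.5, -0.2, 0.0]), np.array([1.0, 0.5, 0.8, 1.2])
mu_b, var_b = np.array([0.2, 0.4, -0.1, 0.1]), np.array([0.9, 0.6, 0.7, 1.1])

# Smaller distance -> the pair is a more likely link.
link_score = w2_distance_sq(mu_a, var_a, mu_b, var_b)
```

A closed-form distance like this is what makes inference on Gaussian embeddings cheap: no sampling is needed, and the covariance term lets the model express uncertainty that a point embedding cannot.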
