Heterogeneous Deep Graph Infomax

Graph representation learning aims to learn universal node representations that preserve both node attributes and structural information. The derived node representations can then serve various downstream tasks, such as node classification and node clustering. When the graph is heterogeneous, the problem becomes more challenging than its homogeneous counterpart. Inspired by emerging information-theoretic learning algorithms, in this paper we propose Heterogeneous Deep Graph Infomax (HDGI), an unsupervised graph neural network for heterogeneous graph representation learning. We use meta-path structures to analyze the semantics carried by connections in heterogeneous graphs, and employ a graph convolution module together with a semantic-level attention mechanism to capture local representations. By maximizing local-global mutual information, HDGI effectively learns high-level node representations that can be utilized in downstream graph-related tasks. Experimental results show that HDGI remarkably outperforms state-of-the-art unsupervised graph representation learning methods on both classification and clustering tasks. By feeding the learned representations into a parametric model, such as logistic regression, we even achieve performance comparable to state-of-the-art supervised end-to-end GNN models on node classification tasks.
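To make the pipeline concrete, the sketch below shows one way the HDGI objective described above can be wired together: meta-path-specific graph convolutions, semantic-level attention that fuses them into node embeddings, a mean readout as the global graph summary, and a bilinear discriminator trained to maximize local-global mutual information. This is a minimal PyTorch sketch on toy data; the layer sizes, module names, and the corruption-by-feature-shuffling choice are assumptions for illustration, not the authors' reference implementation.

```python
# Hedged sketch of HDGI-style local-global mutual information maximization.
# All dimensions and names are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn

class MetaPathGCN(nn.Module):
    """One-layer graph convolution applied to a single normalized meta-path adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, adj_norm):
        # Local, meta-path-specific node representations.
        return torch.relu(adj_norm @ self.lin(x))

class SemanticAttention(nn.Module):
    """Semantic-level attention that fuses meta-path-specific embeddings."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                  nn.Linear(hidden, 1, bias=False))

    def forward(self, z_list):                        # list of (N, dim) tensors
        z = torch.stack(z_list, dim=1)                # (N, P, dim)
        beta = torch.softmax(self.proj(z).mean(dim=0), dim=0)  # (P, 1) meta-path weights
        return (beta.unsqueeze(0) * z).sum(dim=1)     # (N, dim) fused node embeddings

class HDGISketch(nn.Module):
    def __init__(self, in_dim, out_dim, num_paths):
        super().__init__()
        self.gcns = nn.ModuleList(MetaPathGCN(in_dim, out_dim) for _ in range(num_paths))
        self.att = SemanticAttention(out_dim)
        self.disc = nn.Bilinear(out_dim, out_dim, 1)  # scores (node, summary) pairs

    def embed(self, x, adjs):
        return self.att([gcn(x, adj) for gcn, adj in zip(self.gcns, adjs)])

    def forward(self, x, x_shuffled, adjs):
        h_pos = self.embed(x, adjs)                   # embeddings of the real graph
        h_neg = self.embed(x_shuffled, adjs)          # negatives from corrupted (shuffled) features
        summary = torch.sigmoid(h_pos.mean(dim=0)).expand_as(h_pos)  # mean readout as global summary
        logits = torch.cat([self.disc(h_pos, summary),
                            self.disc(h_neg, summary)]).squeeze(-1)
        labels = torch.cat([torch.ones(x.size(0)), torch.zeros(x.size(0))])
        return nn.functional.binary_cross_entropy_with_logits(logits, labels)

# Toy usage: 2 meta-paths, 100 nodes, 16-dimensional features.
N, F, P = 100, 16, 2
x = torch.randn(N, F)
adjs = [torch.eye(N) for _ in range(P)]               # stand-ins for normalized meta-path adjacencies
model = HDGISketch(F, 32, P)
loss = model(x, x[torch.randperm(N)], adjs)
loss.backward()
```

In practice the normalized meta-path adjacencies would be derived from the heterogeneous graph itself, and the embeddings returned by `embed` would be frozen and fed to a simple parametric model such as logistic regression for the downstream classification and clustering evaluations.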
