Heterogeneous Information Network Embedding With Adversarial Disentangler

Heterogeneous information network (HIN) embedding, which learns low-dimensional node representations while preserving the semantic and structural correlations in HINs, has gained considerable attention in recent years. Many existing methods that exploit meta-path-guided strategies have shown promising results. However, the learned node representations can be highly entangled, which hinders downstream tasks; for example, an author's publications in multidisciplinary venues may make it difficult to predict his or her research interests. To address this issue, we develop a novel framework named HEAD (i.e., HIN Embedding with Adversarial Disentangler) to separate the distinct, informative factors of variation in node semantics formulated by meta-paths. More specifically, in HEAD, we first propose a meta-path disentangler to separate node embeddings from various meta-paths into intrinsic and specific spaces; then, with meta-path schemes as self-supervised information, we design two adversarial learners (i.e., meta-path and semantic discriminators) to make the intrinsic embedding more independent of the designed meta-paths and the specific embedding more meta-path dependent. To comprehensively evaluate the performance of HEAD, we perform a set of experiments on four real-world datasets. Compared to state-of-the-art baselines, HEAD achieves up to a 15% improvement in performance, demonstrating its effectiveness and the benefits of the learned disentangled representations.
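The abstract describes the core mechanism but no implementation details. Below is a minimal PyTorch sketch of the idea as stated: a disentangler splits each meta-path-specific node embedding into an intrinsic and a specific part, and a discriminator with the meta-path label as self-supervision is used adversarially so the specific part stays meta-path predictive while the intrinsic part does not. All module and function names (MetaPathDisentangler, MetaPathDiscriminator, adversarial_step) are hypothetical illustrations, not the authors' code, and the uniform-output KL term is one common adversarial surrogate rather than the paper's exact loss.

```python
# Illustrative sketch only; names and losses are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetaPathDisentangler(nn.Module):
    """Splits a meta-path-specific node embedding into an intrinsic part
    (intended to be meta-path independent) and a specific part
    (intended to be meta-path dependent)."""

    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.to_intrinsic = nn.Linear(hid_dim, out_dim)
        self.to_specific = nn.Linear(hid_dim, out_dim)

    def forward(self, x):
        h = self.encoder(x)
        return self.to_intrinsic(h), self.to_specific(h)


class MetaPathDiscriminator(nn.Module):
    """Predicts which meta-path an embedding came from; the meta-path scheme
    serves as free self-supervision because each input is produced under a
    known scheme."""

    def __init__(self, emb_dim, num_meta_paths):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(emb_dim, emb_dim), nn.ReLU(),
            nn.Linear(emb_dim, num_meta_paths),
        )

    def forward(self, z):
        return self.net(z)


def adversarial_step(x, meta_path_ids, disentangler, discriminator, num_meta_paths):
    """One illustrative training step: the specific embedding should let the
    discriminator recover the meta-path, while the intrinsic embedding should
    leave it close to a uniform guess (meta-path independence)."""
    z_int, z_spec = disentangler(x)

    # Specific embedding: standard classification loss on the meta-path label.
    loss_spec = F.cross_entropy(discriminator(z_spec), meta_path_ids)

    # Intrinsic embedding: push the discriminator's output toward uniform,
    # i.e. strip meta-path information from the intrinsic space.
    logits_int = discriminator(z_int)
    uniform = torch.full_like(logits_int, 1.0 / num_meta_paths)
    loss_int = F.kl_div(F.log_softmax(logits_int, dim=-1), uniform,
                        reduction="batchmean")

    return loss_spec + loss_int


if __name__ == "__main__":
    num_meta_paths, in_dim = 3, 64
    disentangler = MetaPathDisentangler(in_dim, 32, 16)
    discriminator = MetaPathDiscriminator(16, num_meta_paths)

    x = torch.randn(8, in_dim)                       # meta-path-specific embeddings
    labels = torch.randint(0, num_meta_paths, (8,))  # which meta-path produced each row
    loss = adversarial_step(x, labels, disentangler, discriminator, num_meta_paths)
    loss.backward()
    print(float(loss))
```

In the full framework described by the abstract, a second (semantic) discriminator plays the complementary adversarial role; the sketch shows only the meta-path side to keep the mechanism readable.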
