Multi-Aspect Temporal Network Embedding: A Mixture of Hawkes Process View

Recent years have witnessed tremendous research interest in network embedding. Prior works have taken neighborhood formation as the critical signal for revealing the inherent dynamics of network structure, and have proposed encoding temporal edge-formation sequences to capture the historical influence of neighbors. In this paper, however, we argue that edge formation can be attributed to a variety of driving factors beyond temporal influence, which we refer to as multiple aspects. Indeed, different aspects of a node can drive the formation of distinct neighborhoods, giving rise to multi-aspect embedding that relates to but goes beyond a purely temporal scope. Along this vein, we propose a Mixture of Hawkes-based Temporal Network Embedding (MHNE) model to capture the aspect-driven neighborhood formation of networks. MHNE encodes the multi-aspect embeddings into a mixture of Hawkes processes, gaining the advantages of modeling both excitation effects and latent aspects. Specifically, a graph attention mechanism assigns weights that account for the excitation effects of historical events, while a Gumbel-Softmax is plugged in to derive the distribution over the aspects. Extensive experiments on 8 different temporal networks demonstrate the strong performance of the multi-aspect embeddings obtained by MHNE in comparison with state-of-the-art methods.
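The two mechanisms named in the abstract can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's actual implementation: the exponential decay kernel, the fixed per-event attention weights, and all function names here are assumptions made for illustration.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Draw a relaxed (differentiable) sample over latent aspects
    by adding Gumbel(0, 1) noise to the logits and applying a
    temperature-scaled softmax."""
    rng = rng or np.random.default_rng(0)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0,1) noise
    y = (logits + g) / tau
    e = np.exp(y - y.max())
    return e / e.sum()

def mixture_hawkes_intensity(t, history, base_rates, aspect_probs,
                             attn, decay=1.0):
    """Conditional intensity at time t as an aspect-weighted mixture of
    Hawkes processes: each past event at time t_h excites the present
    with an attention weight attn[h] and an exponential time decay
    (an assumed kernel choice for this sketch)."""
    excitation = sum(a * np.exp(-decay * (t - t_h))
                     for a, t_h in zip(attn, history))
    per_aspect = base_rates + excitation      # one base rate per aspect
    return float(np.dot(aspect_probs, per_aspect))
```

With no history the intensity reduces to the aspect-weighted base rate, and each past event adds an attention-scaled, decaying excitation term; the Gumbel-Softmax output plays the role of `aspect_probs`.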
