DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks

Learning node representations in graphs is important for many applications such as link prediction, node classification, and community detection. Existing graph representation learning methods primarily target static graphs, while many real-world graphs evolve over time. Complex time-varying graph structures make it challenging to learn informative node representations over time. We present Dynamic Self-Attention Network (DySAT), a novel neural architecture that learns node representations capturing dynamic graph structural evolution. Specifically, DySAT computes node representations through joint self-attention along two dimensions: structural neighborhood and temporal dynamics. Compared with state-of-the-art recurrent methods for modeling graph evolution, dynamic self-attention is more efficient while achieving consistently superior performance. We conduct link prediction experiments on two graph types: communication networks and bipartite rating networks. Experimental results demonstrate significant performance gains for DySAT over several state-of-the-art graph embedding baselines, in both single-step and multi-step link prediction tasks. Furthermore, our ablation study validates the effectiveness of jointly modeling structural and temporal self-attention.
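The joint attention described above can be illustrated with a minimal sketch: scaled dot-product self-attention is applied first within each graph snapshot (over a node's structural neighbors, masked by the adjacency matrix) and then across snapshots (over each node's own embedding sequence, with a causal mask). All function names, the single-head formulation, and the parameter shapes are illustrative assumptions, not the paper's exact multi-head architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv, mask=None):
    """Single-head scaled dot-product self-attention over the rows of X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # block masked positions
    return softmax(scores) @ V

def dysat_sketch(snapshots, adjs, Ws, Wt):
    """Hypothetical DySAT-style forward pass (single head, no feature transforms).

    snapshots: list of T arrays, each (N, F) node features
    adjs:      list of T boolean (N, N) adjacency masks (self-loops included)
    Ws, Wt:    (Wq, Wk, Wv) parameter triples for the structural and
               temporal attention layers, respectively
    """
    # 1) structural self-attention: attend only over graph neighbors
    structural = [self_attention(X, *Ws, mask=A) for X, A in zip(snapshots, adjs)]
    # 2) temporal self-attention: each node attends over its own history
    H = np.stack(structural, axis=1)               # (N, T, D)
    T = H.shape[1]
    causal = np.tril(np.ones((T, T), dtype=bool))  # attend to past/present only
    return np.stack([self_attention(h, *Wt, mask=causal) for h in H])
```

The causal mask reflects that an embedding at time step t should depend only on snapshots up to t, which is what makes the representations usable for forecasting future links.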
