TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning

Dynamic graph modeling has recently attracted much attention due to its extensive applications in many real-world scenarios, such as recommendation systems, financial transactions, and social networks. Although many approaches to dynamic graph modeling have been proposed in recent years, effective and scalable models are yet to be developed. In this paper, we propose a novel graph neural network approach, called TCL, which handles dynamically evolving graphs in a continuous-time fashion and enables effective dynamic node representation learning that captures both temporal and topological information. Technically, our model has three novel aspects. First, we generalize the vanilla Transformer to temporal graph learning scenarios and design a graph-topology-aware Transformer. Second, on top of the proposed graph Transformer, we introduce a two-stream encoder that separately extracts representations from the temporal neighborhoods of the two interaction nodes and then uses a co-attentional Transformer to model their inter-dependencies at a semantic level. Third, inspired by recently developed contrastive learning techniques, we propose to optimize our model by maximizing the mutual information (MI) between the predictive representations of the two future interaction nodes. Benefiting from this, our dynamic representations preserve high-level (or global) semantics of interactions and are thus robust to noisy interactions. To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs. We evaluate our model on four benchmark datasets for interaction prediction, and the experimental results demonstrate the superiority of our model.
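To make the co-attentional component more concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' released code) of cross-attention between the two neighborhood streams: each interaction node's temporal-neighborhood sequence queries the other's, so the two streams exchange information at a semantic level. The class and argument names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CoAttentionBlock(nn.Module):
    """Hypothetical co-attention layer between two temporal-neighborhood sequences."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn_a = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.attn_b = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_a = nn.LayerNorm(dim)
        self.norm_b = nn.LayerNorm(dim)

    def forward(self, seq_a: torch.Tensor, seq_b: torch.Tensor):
        # seq_a, seq_b: (batch, length, dim) encodings of the two nodes' temporal neighborhoods.
        # Each stream uses its own sequence as queries and the other stream as keys/values.
        a2b, _ = self.attn_a(query=seq_a, key=seq_b, value=seq_b)
        b2a, _ = self.attn_b(query=seq_b, key=seq_a, value=seq_a)
        # Residual connection plus layer normalization, as in a standard Transformer block.
        return self.norm_a(seq_a + a2b), self.norm_b(seq_b + b2a)
```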

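Similarly, the contrastive objective can be read as an InfoNCE-style lower bound on the MI between the predictive representations of the two interaction nodes. The sketch below is again a hedged illustration under assumptions (cosine similarity, a fixed temperature, and in-batch negatives), not the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def infonce_loss(z_src: torch.Tensor, z_dst: torch.Tensor, temperature: float = 0.1):
    """InfoNCE-style objective between source and destination node representations.

    z_src, z_dst: (batch, dim) predictive embeddings of the two nodes of each observed
    interaction; the other pairs in the mini-batch serve as negatives.
    """
    z_src = F.normalize(z_src, dim=-1)
    z_dst = F.normalize(z_dst, dim=-1)
    logits = z_src @ z_dst.t() / temperature  # (batch, batch) similarity matrix
    labels = torch.arange(z_src.size(0), device=z_src.device)  # positives on the diagonal
    # Symmetric cross-entropy: each source should identify its true destination and vice versa.
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))
```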