Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network

Static graph neural networks have been widely used for modeling and representation learning on graph-structured data. However, many real-world problems, such as social networks, financial transactions, and recommendation systems, are dynamic: nodes and edges are added or deleted over time. Dynamic graph neural networks have therefore received growing attention in recent years. In this work, we propose a novel dynamic graph neural network, Efficient-Dyn. It adaptively encodes temporal information into a sequence of patches that each contain an equal amount of temporal-topological structure. This avoids the information loss caused by snapshot-based discretization while achieving a time granularity close to that of continuous-time methods. In addition, we design a lightweight module, the Sparse Temporal Transformer, which computes node representations from both structural neighborhoods and temporal dynamics. Because its fully-connected attention is simplified to a sparse form, its computational cost is far lower than that of current state-of-the-art models. We conduct link prediction experiments on both continuous and discrete graph datasets. Comparisons with several state-of-the-art graph embedding baselines show that Efficient-Dyn achieves faster inference while delivering competitive performance.
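The abstract gives no implementation details, but the event-based patching idea can be illustrated with a minimal sketch. The code below assumes the dynamic graph arrives as a time-ordered stream of (source, destination, timestamp) interaction events; the function name make_patches and the parameter events_per_patch are hypothetical and not taken from the paper.

    import numpy as np

    def make_patches(events, events_per_patch):
        """Partition a time-ordered event stream into patches that each
        carry the same number of interaction events. Bursty periods thus
        yield many short patches and quiet periods yield few long ones,
        giving a finer effective time granularity than fixed-interval
        snapshots.

        events: array of shape (E, 3) with rows (src, dst, timestamp),
                sorted by timestamp.
        """
        n_patches = int(np.ceil(len(events) / events_per_patch))
        return [events[i * events_per_patch:(i + 1) * events_per_patch]
                for i in range(n_patches)]

    # A toy stream: a burst of activity followed by a sparse period.
    stream = np.array([
        (0, 1, 0.1), (1, 2, 0.15), (0, 2, 0.2),   # burst -> short patch
        (2, 3, 1.0), (3, 4, 5.0), (0, 4, 9.5),    # sparse -> long patch
        (1, 4, 9.9),
    ])
    for p in make_patches(stream, events_per_patch=3):
        print(f"patch spans t=[{p[0, 2]}, {p[-1, 2]}], {len(p)} events")

On this reading, the Sparse Temporal Transformer would then restrict each node's attention to its temporal neighborhood within such a patch rather than attending over all node pairs, which is where the claimed cost reduction relative to fully-connected attention would come from.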
