Learning Temporal Interaction Graph Embedding via Coupled Memory Networks

Graph embedding has become a research focus in both academia and industry due to its powerful representation capabilities. The majority of existing work learns node embeddings in the context of static, plain or attributed, homogeneous graphs. However, many real-world applications frequently involve bipartite graphs with temporal and attributed interaction edges, known as temporal interaction graphs. These temporal interactions usually imply different facets of interest and may even evolve over time, posing significant challenges for learning effective node representations. In this paper, we propose a novel framework named TigeCMN to learn node representations from a sequence of temporal interactions. Specifically, we devise two coupled memory networks to store and update node embeddings in external memory matrices explicitly and dynamically, which forms deep matrix representations and enhances the expressiveness of the node embeddings. We conduct experiments on two real-world datasets, and the results empirically demonstrate that TigeCMN outperforms state-of-the-art methods by varying margins.
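To make the coupled-memory idea concrete, the sketch below is a minimal, illustrative NumPy implementation: each node on either side of the bipartite graph owns a small external memory matrix, a temporal interaction triggers an attention-weighted write to both the user-side and item-side memories, and reading a node's memory (here via simple mean pooling) yields its embedding. The class and method names (MemoryModule, write, read), the erase/add update rule, the pooling choice, and all dimensions are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MemoryModule:
    """One side of the coupled memories: each node owns an external
    memory matrix of shape (num_slots, slot_dim). Illustrative only."""
    def __init__(self, num_nodes, num_slots, slot_dim, rng):
        self.memory = rng.normal(0, 0.1, size=(num_nodes, num_slots, slot_dim))
        # Shared key matrix used to address memory slots via attention.
        self.keys = rng.normal(0, 0.1, size=(num_slots, slot_dim))

    def write(self, node, interaction_vec):
        """Attention-based update: attend over slots with the interaction
        vector, then blend the attended slots toward it (assumed rule)."""
        attn = softmax(self.keys @ interaction_vec)            # (num_slots,)
        erase = np.outer(attn, np.ones_like(interaction_vec))  # soft erase gate
        add = np.outer(attn, interaction_vec)                  # soft add signal
        self.memory[node] = self.memory[node] * (1 - 0.5 * erase) + 0.5 * add

    def read(self, node):
        """Collapse the node's memory matrix to an embedding (mean pooling)."""
        return self.memory[node].mean(axis=0)

rng = np.random.default_rng(0)
user_mem = MemoryModule(num_nodes=100, num_slots=4, slot_dim=16, rng=rng)
item_mem = MemoryModule(num_nodes=50, num_slots=4, slot_dim=16, rng=rng)

# A temporal interaction between user 3 and item 7 with an edge-attribute vector:
edge_feat = rng.normal(size=16)
user_mem.write(3, edge_feat)   # update the user-side memory with the interaction
item_mem.write(7, edge_feat)   # coupled update on the item side
z_u, z_i = user_mem.read(3), item_mem.read(7)
```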
