Deep graph kernel point processes

Point process models are widely used to analyze asynchronous events that occur on a graph and to capture how different types of events influence one another. Predicting the times and types of future events is a central task, and the size and topology of the graph compound its difficulty. Recent neural point process models have shown promise in capturing intricate dependencies across event categories. However, these methods use an unfiltered event history, so all event categories enter the intensity computation for every target event type. In this work, we propose a graph point process method in which event interactions are governed by a latent graph topology. The corresponding undirected graph has nodes representing event categories and edges indicating potential contribution relationships. We then develop a novel deep graph kernel to characterize the triggering and inhibiting effects between events. The intrinsic influence structures are incorporated via a graph neural network (GNN) model that represents the learnable kernel. The computational efficiency of the GNN allows our model to scale to large graphs. Comprehensive experiments on synthetic and real-world data show that our approach outperforms state-of-the-art methods in predicting future events and uncovering the relational structure underlying the data.

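To make the construction concrete, below is a minimal sketch of how a GNN-parameterized kernel can enter a conditional intensity over event categories. It is an illustration under our own assumptions (a single message-passing layer, an inner-product kernel, exponential temporal decay), not the paper's actual implementation, and every name in it (e.g. `GraphKernelIntensity`) is hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphKernelIntensity(nn.Module):
    """Hypothetical sketch of lambda_k(t | history) with a GNN-based kernel:
    lambda_k(t) = softplus( mu_k + sum_i kappa(k_i, k) * exp(-beta * (t - t_i)) ),
    where kappa is derived from node embeddings of a latent graph over categories.
    """

    def __init__(self, num_categories, embed_dim, adjacency):
        super().__init__()
        self.register_buffer("adj", adjacency)                # latent undirected graph, (K, K)
        self.node_feat = nn.Parameter(torch.randn(num_categories, embed_dim))
        self.gnn = nn.Linear(embed_dim, embed_dim)            # one illustrative message-passing layer
        self.mu = nn.Parameter(torch.zeros(num_categories))   # background rate per category
        self.beta = nn.Parameter(torch.ones(1))               # temporal decay rate

    def node_embeddings(self):
        # Normalized-adjacency message passing; stands in for a deeper GNN.
        deg = self.adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = (self.adj / deg) @ self.node_feat
        return torch.tanh(self.gnn(h))                        # (K, d)

    def intensity(self, target_type, event_times, event_types, t):
        """Intensity of `target_type` at time t given past (event_times, event_types)."""
        z = self.node_embeddings()
        # Inner-product kernel (can be negative -> inhibition), masked by the latent
        # graph so only connected categories contribute to the target type.
        kappa = (z[event_types] @ z[target_type]) * self.adj[event_types, target_type]
        decay = torch.exp(-F.softplus(self.beta) * (t - event_times))
        return F.softplus(self.mu[target_type] + (kappa * decay).sum())


# Toy usage: 4 event categories, fully connected latent graph, 3 past events.
model = GraphKernelIntensity(num_categories=4, embed_dim=8, adjacency=torch.ones(4, 4))
lam = model.intensity(target_type=3,
                      event_times=torch.tensor([0.5, 1.2, 2.0]),
                      event_types=torch.tensor([0, 2, 1]),
                      t=2.5)
```

In this sketch the latent adjacency masks which categories may influence the target type, and the signed inner-product kernel combined with the final softplus lets past events either raise or suppress the intensity, mirroring the triggering and inhibiting effects described above.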