Local Event Forecasting and Synthesis Using Unpaired Deep Graph Translations

Local rare-event forecasting and synthesis on networks are highly useful for emergency management. For example, synthesizing traffic congestion over the road network, or disease diffusion over the contact network, of a specific geo-location is important for transportation planning and disease-outbreak intervention. This task requires learning how events such as congestion or disease "translate" graph patterns from a source mode (e.g., without the event) to a target mode (e.g., with the event), based on historical data from some locations. The learned "translation" is then applied to a source-mode graph pattern from a new location's network in order to estimate and foresee what that network will look like in the target mode. We call this task graph translation, in analogy to, and as a generalization of, image and text translation. As in image and text translation, paired training data, consisting of source-mode graphs and their corresponding target-mode graphs, is usually unavailable. In this work, we propose an approach that learns the translation of graphs from source mode to target mode such that, under an adversarial loss, the generated target-mode graphs are indistinguishable from the distribution of real target-mode graphs. Because there is no paired training data, we also learn an inverse translation from target mode to source mode and couple the two translation mappings through a cycle-consistency loss. Extensive experiments on both synthetic and real-world application data demonstrate that the proposed approach is capable of generating graphs close to the real target graphs. Case studies on the synthesized networks are also illustrated and analyzed to show that the generated target-mode graphs are reasonable.
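
For concreteness, here is a minimal sketch of the kind of objective this setup suggests, in the style of cycle-consistent adversarial training; the translators $G$ (source to target) and $F$ (target to source), the discriminators $D_S$ and $D_T$, and the weight $\lambda$ are illustrative notation rather than the paper's own. With $s$ a source-mode graph and $t$ a target-mode graph (e.g., represented by adjacency matrices):

$$\mathcal{L}_{\mathrm{adv}}(G, D_T) = \mathbb{E}_{t}\big[\log D_T(t)\big] + \mathbb{E}_{s}\big[\log\big(1 - D_T(G(s))\big)\big],$$

$$\mathcal{L}_{\mathrm{cyc}}(G, F) = \mathbb{E}_{s}\big[\lVert F(G(s)) - s \rVert_1\big] + \mathbb{E}_{t}\big[\lVert G(F(t)) - t \rVert_1\big],$$

$$\min_{G, F}\ \max_{D_S, D_T}\ \mathcal{L}_{\mathrm{adv}}(G, D_T) + \mathcal{L}_{\mathrm{adv}}(F, D_S) + \lambda\,\mathcal{L}_{\mathrm{cyc}}(G, F).$$

The adversarial terms push generated target-mode (and source-mode) graphs toward the corresponding real distributions, while the cycle-consistency term couples the two unpaired mappings so that translating a graph and then translating it back approximately recovers the original.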
