Training Robust Graph Neural Networks with Topology Adaptive Edge Dropping

Graph neural networks (GNNs) are processing architectures that exploit graph structure to learn representations from network data. Despite their success, GNNs suffer from sub-optimal generalization when training data is limited, a problem referred to as over-fitting. This paper proposes the Topology Adaptive Edge Dropping (TADropEdge) method, an adaptive data-augmentation technique that improves generalization performance and yields robust GNN models. We start by explicitly analyzing how random edge dropping increases data diversity during training, and show that i.i.d. edge dropping ignores graph structural information and can produce noisy augmented data that degrades performance. To overcome this issue, we treat graph connectivity as the key property capturing graph topology. TADropEdge incorporates this factor into random edge dropping so that the edge-dropped subgraphs maintain a topology similar to that of the underlying graph, yielding more satisfactory data augmentation. In particular, TADropEdge first leverages the graph spectrum to assign each edge a weight that represents its criticality for establishing graph connectivity. It then normalizes the edge weights and drops edges adaptively based on their normalized values. Besides improving generalization performance, TADropEdge reduces variance for efficient training and applies as a generic module to different GNN models. Extensive experiments on real-world and synthetic datasets corroborate the theory and verify the effectiveness of the proposed method.
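The pipeline the abstract describes — spectrum-derived edge weights, normalization, then adaptive edge dropping — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the abstract does not specify the weighting formula, so the sketch approximates edge criticality with the Fiedler vector of the graph Laplacian (edges bridging the two spectral halves of the graph get higher weight and are therefore dropped less often). The function names and the `base_keep` parameter are hypothetical.

```python
import numpy as np

def spectral_edge_weights(A):
    """Weight each edge by an approximate connectivity criticality.

    Assumption: we use the squared Fiedler-vector difference across an
    edge as a proxy for how critical that edge is to graph connectivity;
    the paper's actual spectral weighting may differ.
    """
    d = A.sum(axis=1)
    L = np.diag(d) - A                      # combinatorial Laplacian
    _, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1]                    # eigenvector of 2nd-smallest eigenvalue
    weights = {}
    for i, j in zip(*np.triu_indices_from(A, k=1)):
        if A[i, j]:
            # edges connecting different spectral clusters score higher
            weights[(i, j)] = (fiedler[i] - fiedler[j]) ** 2
    return weights

def tad_drop_edge(A, weights, base_keep=0.7, rng=None):
    """Drop edges with probability decreasing in their normalized weight,
    so connectivity-critical edges survive and the subgraph keeps a
    topology similar to the original graph."""
    rng = rng or np.random.default_rng()
    w = np.array(list(weights.values()))
    w_norm = w / (w.max() + 1e-12)          # normalize to [0, 1]
    A_sub = A.copy()
    for (i, j), wn in zip(weights.keys(), w_norm):
        keep_p = base_keep + (1.0 - base_keep) * wn
        if rng.random() > keep_p:
            A_sub[i, j] = A_sub[j, i] = 0.0
    return A_sub
```

At each training epoch one would resample `A_sub = tad_drop_edge(A, weights, ...)` and feed it to the GNN in place of `A`, giving diverse yet topology-preserving augmented inputs; the weights need computing only once per graph.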