AdaptiveGCN: Efficient GCN Through Adaptively Sparsifying Graphs

Graph Convolutional Networks (GCNs) have become the prevailing approach for efficiently learning representations from graph-structured data. Current GCN models follow a neighborhood aggregation scheme built on two primary operations: aggregation and combination. The workload of both operations is determined by the structure of the input graph, which makes the graph itself the bottleneck of GCN processing. At the same time, graphs often carry a large amount of task-irrelevant information that hurts model generalization, which opens an opportunity to remove this redundancy. In this paper, we aim to accelerate GCN models by removing task-irrelevant edges from the graph. We present AdaptiveGCN, an efficient, supervised graph sparsification framework. AdaptiveGCN uses an edge predictor module that learns an edge selection strategy for each GCN layer separately and adaptively from downstream task feedback during training, and then runs inference with only the selected edges to speed up GCN computation. Experimental results on public graph learning benchmarks show that AdaptiveGCN yields an average GCN speed-up of 43% on CPU and 39% on GPU with comparable model performance.
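
The abstract describes a per-layer edge predictor trained end to end from the task loss and used to prune edges at test time. Below is a minimal sketch of that idea, not the paper's released code: the module names (EdgePredictor, SparsifiedGCNLayer), the MLP scorer, the hidden sizes, and the mean-aggregation combine step are all illustrative assumptions; the differentiable keep/drop sampling uses the straight-through Gumbel-Softmax that the paper's related work points to.

```python
# Sketch: learned per-layer edge sparsification with a straight-through
# Gumbel-Softmax. Assumes node features `x` of shape [N, F] and an edge
# list `edge_index` of shape [2, E]. Not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgePredictor(nn.Module):
    """Scores each edge from its endpoint features and samples a keep/drop mask."""
    def __init__(self, in_dim, hidden=16, tau=1.0):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2))  # logits: [drop, keep]
        self.tau = tau

    def forward(self, x, edge_index):
        src, dst = edge_index
        logits = self.mlp(torch.cat([x[src], x[dst]], dim=-1))  # [E, 2]
        if self.training:
            # Differentiable discrete sampling (straight-through Gumbel-Softmax)
            # lets the downstream task loss train the predictor end to end.
            mask = F.gumbel_softmax(logits, tau=self.tau, hard=True)[:, 1]
        else:
            # At test time, deterministically keep only the favored edges.
            mask = (logits.argmax(dim=-1) == 1).float()
        return mask  # [E], 1 = keep, 0 = drop

class SparsifiedGCNLayer(nn.Module):
    """Mean-aggregation GCN layer that aggregates only over kept edges."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.predictor = EdgePredictor(in_dim)  # one predictor per layer

    def forward(self, x, edge_index):
        mask = self.predictor(x, edge_index)
        src, dst = edge_index
        w = mask.unsqueeze(-1)                       # zero out dropped edges
        agg = torch.zeros_like(x).index_add_(0, dst, w * x[src])  # aggregate
        deg = torch.zeros(x.size(0), 1, device=x.device).index_add_(
            0, dst, w).clamp(min=1)                  # per-node kept-edge count
        return F.relu(self.lin(agg / deg))           # combine
```

For brevity the sketch multiplies dropped edges by zero; to realize the actual inference speed-up, an implementation would physically filter the edge list first (e.g. `edge_index[:, mask.bool()]`) so the pruned edges are never aggregated at all.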
