Improving Graph Neural Networks by Filter Preprocessing

For graph-based semi-supervised learning, Graph Convolutional Networks (GCNs) and their variants have shown outstanding results and gained wide attention. Several works that analyze GCNs from the perspective of spectral graph theory show that GCNs act as low-pass filters on certain learning tasks, which gives us a deeper understanding of GCNs. However, a GCN achieves this filtering effect implicitly through layers of matrix multiplication, so the filtering process cannot be controlled flexibly. In this paper, we propose to preprocess the graph-structured data with explicit low-pass filters before network training and prediction. We conduct experiments on citation network datasets. Our preliminary results show that the filter preprocessing step can effectively improve the predictive accuracy of common neural networks and graph neural networks in graph-based semi-supervised learning.
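
As a concrete illustration of the idea, the sketch below smooths the node features with k applications of the renormalized adjacency D^{-1/2}(A+I)D^{-1/2} before any classifier is trained, which is one common way to realize an explicit low-pass graph filter. The function name `low_pass_preprocess`, the choice of SciPy sparse matrices, and the default k=2 are our own assumptions for illustration, not the paper's implementation.

```python
import numpy as np
import scipy.sparse as sp


def low_pass_preprocess(adj, features, k=2):
    """Smooth node features with k steps of the renormalized adjacency.

    adj      : scipy.sparse adjacency matrix of shape (N, N)
    features : dense node feature matrix of shape (N, F)
    k        : number of propagation steps; larger k gives stronger smoothing
    """
    n = adj.shape[0]
    # Add self-loops: A_hat = A + I (guarantees every degree is at least 1)
    adj_hat = adj + sp.eye(n)
    # Symmetric normalization: D^{-1/2} A_hat D^{-1/2}
    deg = np.asarray(adj_hat.sum(axis=1)).flatten()
    d_inv_sqrt = sp.diags(np.power(deg, -0.5))
    adj_norm = d_inv_sqrt @ adj_hat @ d_inv_sqrt
    # Repeated multiplication acts as a low-pass filter on the node features
    smoothed = features
    for _ in range(k):
        smoothed = adj_norm @ smoothed
    return smoothed
```

Under this reading, the smoothed features can then be fed to an ordinary MLP or to a graph neural network, so the strength of the filtering (via k, or via a different filter design) is controlled independently of the network architecture.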
