Attention-based Graph Convolutional Network for Recommendation System

Matrix completion from rating data together with auxiliary information about users and items is a challenging task in recommendation systems. In this paper, we propose an end-to-end architecture, the Attention-based Graph Convolutional Network (AGCN), that embeds both rating data and auxiliary information in a unified space and then learns low-rank dense representations via graph convolutional networks and attention layers. Compared to previous work, AGCN reduces computational complexity by using Chebyshev polynomial graph filters. The attention layer, which weights neighbor information to learn more expressive structural graph representations, improves prediction accuracy and yields faster, more stable convergence. Experimental results show that our model achieves higher accuracy and faster convergence than current state-of-the-art methods on the real-world MovieLens and Flixster datasets.
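To make the complexity claim concrete, the sketch below shows a generic K-order Chebyshev polynomial graph filter in the style of Defferrard et al.'s fast localized spectral filtering; it is a minimal NumPy illustration under assumed conventions (symmetric normalized Laplacian, dense adjacency matrix, filter coefficients `theta`), not the authors' exact AGCN implementation. The recurrence avoids the eigendecomposition of the full spectral approach: each order costs one sparse matrix-vector product, so the filter runs in O(K|E|) time.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return np.eye(A.shape[0]) - (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def chebyshev_filter(X, A, theta):
    """Apply a K-order Chebyshev polynomial graph filter to node features X.

    Uses the recurrence T_0 = X, T_1 = L~ X, T_k = 2 L~ T_{k-1} - T_{k-2},
    where L~ = 2L/lambda_max - I is the Laplacian rescaled to [-1, 1].
    `theta` is a list of K filter coefficients (hypothetical learned weights).
    """
    L = normalized_laplacian(A)
    lam_max = np.linalg.eigvalsh(L).max()  # in practice often approximated as 2
    L_tilde = (2.0 / lam_max) * L - np.eye(A.shape[0])

    T_prev, T_curr = X, L_tilde @ X
    out = theta[0] * T_prev
    if len(theta) > 1:
        out = out + theta[1] * T_curr
    for k in range(2, len(theta)):
        T_next = 2 * (L_tilde @ T_curr) - T_prev  # Chebyshev recurrence
        out = out + theta[k] * T_next
        T_prev, T_curr = T_curr, T_next
    return out

# Toy usage: a 4-node cycle graph with constant features.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = np.ones((4, 2))
Y = chebyshev_filter(X, A, theta=[0.5, 0.3, 0.2])
```

In a full model the coefficients `theta` would be learned per feature channel, and an attention mechanism (as in the paper) would replace the fixed neighbor weighting implied by the Laplacian with learned, data-dependent weights.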
