Improving Transformer-based Sequential Recommenders through Preference Editing

One of the key challenges in sequential recommendation (SR) is how to extract and represent user preferences. Traditional SR methods rely on the next item as the supervision signal to guide preference extraction and representation. We propose a novel learning strategy, named preference editing. The idea is to force the SR model to discriminate between the common and unique preferences in different sequences of interactions between users and the recommender system. By doing so, the SR model learns to identify common and unique user preferences, and thereby extracts and represents user preferences more effectively. We propose a transformer-based SR model, named MrTransformer (Multi-preference Transformer), which prepends special tokens to the sequence to represent multiple user preferences and ensures that they capture different aspects through a preference coverage mechanism. We then devise a preference editing-based self-supervised learning mechanism for training MrTransformer that consists of two main operations: preference separation and preference recombination. The former separates the common and unique user preferences for a given pair of sequences. The latter swaps the common preferences to obtain recombined user preferences for each sequence. Based on these two operations, we define two types of self-supervised learning loss, requiring that the recombined preferences are similar to the original ones and that the common preferences are close to each other. We carry out extensive experiments on two benchmark datasets. MrTransformer with preference editing significantly outperforms state-of-the-art SR methods in terms of Recall, MRR, and NDCG. We find that long sequences, whose user preferences are harder to extract and represent, benefit most from preference editing.
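
To make the training objective concrete, below is a minimal PyTorch sketch of the preference separation and recombination operations and the two self-supervised losses they induce. The gate-based separation, the additive recombination, and the use of MSE as the similarity measure are illustrative assumptions rather than the paper's exact formulation; the class and parameter names (PreferenceEditing, gate, p_a, p_b) are likewise hypothetical.

```python
# Minimal sketch of the preference-editing losses described above.
# Assumptions (not from the paper): separation uses a sigmoid gate computed
# from the pair of preference matrices, recombination is additive, and
# "similar/close" is measured with mean-squared error.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PreferenceEditing(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Scores how "common" each preference slot is for a sequence pair.
        self.gate = nn.Linear(2 * dim, 1)

    def forward(self, p_a: torch.Tensor, p_b: torch.Tensor) -> torch.Tensor:
        """p_a, p_b: [batch, num_prefs, dim] preference representations
        extracted by the special tokens of two interaction sequences."""
        # Preference separation: split each slot into common / unique parts.
        g = torch.sigmoid(self.gate(torch.cat([p_a, p_b], dim=-1)))  # [B, K, 1]
        common_a, unique_a = g * p_a, (1 - g) * p_a
        common_b, unique_b = g * p_b, (1 - g) * p_b

        # Preference recombination: swap the common parts across the pair.
        recomb_a = common_b + unique_a
        recomb_b = common_a + unique_b

        # Loss 1: recombined preferences should stay close to the originals.
        loss_recomb = F.mse_loss(recomb_a, p_a) + F.mse_loss(recomb_b, p_b)
        # Loss 2: the common preferences of the two sequences should match.
        loss_common = F.mse_loss(common_a, common_b)
        return loss_recomb + loss_common


# Usage on random preference representations (batch of 8, K = 4 slots, dim = 64).
editor = PreferenceEditing(dim=64)
loss = editor(torch.randn(8, 4, 64), torch.randn(8, 4, 64))
```

In this sketch the two losses would be added to the usual next-item prediction objective during training; only the preference representations, not the raw sequences, are edited.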
