Estimating Reactions and Recommending Products with Generative Models of Reviews

Traditional approaches to recommendation focus on learning from large volumes of historical feedback to estimate simple numerical quantities (will a user click on a product? make a purchase? etc.). Approaches that model natural-language information such as product reviews have proved highly effective at improving these methods, as reviews provide valuable auxiliary signals for estimating latent user preferences and item properties. In this paper, rather than using reviews as inputs to a recommender system, we focus on generating reviews as the model's output. This requires us to efficiently model text (at the character level) so as to capture the preferences of the user, the properties of the item being consumed, and the interaction between them (i.e., the user's preference toward that particular item). We show that this model can be used to (a) generate plausible reviews and estimate nuanced reactions; (b) provide personalized rankings of existing reviews; and (c) recommend existing products more effectively.
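
To make the conditioning idea concrete, below is a minimal PyTorch sketch of one plausible realization: an LSTM character-level language model whose input at each time step concatenates a character embedding with learned user and item latent vectors. The class name, dimensions, and concatenation scheme are illustrative assumptions for exposition, not the paper's exact architecture.

    import torch
    import torch.nn as nn

    class UserItemCharLM(nn.Module):
        """Character-level language model conditioned on user/item latent factors."""

        def __init__(self, vocab_size, num_users, num_items,
                     char_dim=64, latent_dim=32, hidden_dim=256):
            super().__init__()
            self.char_emb = nn.Embedding(vocab_size, char_dim)
            self.user_emb = nn.Embedding(num_users, latent_dim)  # latent user preferences
            self.item_emb = nn.Embedding(num_items, latent_dim)  # latent item properties
            # User/item vectors are appended to every character input, so the
            # recurrent state is conditioned on who is writing and about what.
            self.rnn = nn.LSTM(char_dim + 2 * latent_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, chars, user, item):
            # chars: (batch, seq_len); user, item: (batch,)
            x = self.char_emb(chars)                             # (batch, seq, char_dim)
            cond = torch.cat([self.user_emb(user), self.item_emb(item)], dim=-1)
            cond = cond.unsqueeze(1).expand(-1, x.size(1), -1)   # broadcast over time
            h, _ = self.rnn(torch.cat([x, cond], dim=-1))
            return self.out(h)                                   # next-character logits

Under this sketch, training minimizes next-character cross-entropy; generation samples characters given a (user, item) pair; and personalized ranking of existing reviews, use case (b), falls out by scoring each review's likelihood under the model, as in this toy example (sizes are arbitrary):

    # Score an existing review's log-likelihood for a given (user, item) pair;
    # a higher score suggests the review better matches that user's preferences.
    model = UserItemCharLM(vocab_size=128, num_users=1000, num_items=1000)
    chars = torch.randint(0, 128, (1, 50))             # a toy encoded review
    user, item = torch.tensor([7]), torch.tensor([42])
    logits = model(chars[:, :-1], user, item)          # predict each next character
    nll = nn.functional.cross_entropy(
        logits.reshape(-1, 128), chars[:, 1:].reshape(-1))
    score = -nll.item()                                # mean per-character log-likelihood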
