Reading Customer Reviews to Answer Product-related Questions

E-commerce websites are keen to build community question answering (CQA) services, which help questioners (potential buyers) obtain satisfying answers from experienced customers and thereby stimulate consumption. Given that more than 50% of product-related questions anticipate only a binary response (i.e., “Yes” or “No”), research on product-related question answering (PQA), which aims to automatically provide instant and correct replies to questioners, is emerging rapidly. Mainstream approaches to PQA generally employ customer reviews as evidence to help predict answers to questions that are product-specific and concerned more with subjective personal experiences. However, supportive features extracted by heuristic rules or acquired in an unsupervised manner do not perform well on PQA. In this paper, we contribute an end-to-end neural architecture that is fed directly with the raw text of product-related questions and customer reviews to predict the answers. Concretely, it teaches machines to generate and synthesize multiple question-aware review representations in a reading comprehension fashion to make the final decision. We also build a real-world PQA dataset crawled from 9 categories on Amazon.com to assess the performance of our neural reading architecture (NRA) against other mainstream approaches such as COR-L [12], MOQA [12], and AAP [21]. Experimental results show that our NRA sets a new state-of-the-art on this dataset, significantly outperforming existing algorithms.
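The abstract describes the architecture only at a high level: encode the raw question and review text, produce multiple question-aware review representations, and synthesize them into a yes/no prediction. Below is a minimal, hypothetical PyTorch sketch of one plausible reading of that pattern. All module names, dimensions, and design choices (a shared BiLSTM encoder, token-level attention per review, attention pooling across reviews) are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of a "neural reading architecture" for yes/no PQA.
# Assumed design (not the paper's code): a shared BiLSTM encoder,
# question-to-review token attention producing question-aware review
# representations, review-level attention pooling, and a binary classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralReadingSketch(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 100, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One BiLSTM encoder shared by questions and reviews.
        self.encoder = nn.LSTM(embed_dim, hidden, batch_first=True,
                               bidirectional=True)
        self.review_score = nn.Linear(2 * hidden, 1)   # review-level attention
        self.classifier = nn.Sequential(
            nn.Linear(4 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),                      # logit for "yes"
        )

    def encode(self, ids):
        out, _ = self.encoder(self.embed(ids))         # (B, T, 2H)
        return out

    def forward(self, question_ids, review_ids):
        # question_ids: (B, Tq); review_ids: (B, R, Tr) -- R reviews per question
        B, R, Tr = review_ids.shape
        q = self.encode(question_ids)                  # (B, Tq, 2H)
        q_vec = q.mean(dim=1)                          # simple question summary

        r = self.encode(review_ids.view(B * R, Tr))    # (B*R, Tr, 2H)
        r = r.view(B, R, Tr, -1)

        # "Generate": token-level attention yields a question-aware
        # representation of each review.
        scores = torch.einsum('bd,brtd->brt', q_vec, r)        # (B, R, Tr)
        alpha = F.softmax(scores, dim=-1)
        review_reps = torch.einsum('brt,brtd->brd', alpha, r)  # (B, R, 2H)

        # "Synthesize": attention over reviews aggregates the evidence.
        beta = F.softmax(self.review_score(review_reps).squeeze(-1), dim=-1)
        evidence = torch.einsum('br,brd->bd', beta, review_reps)  # (B, 2H)

        logit = self.classifier(torch.cat([q_vec, evidence], dim=-1))
        return logit.squeeze(-1)  # sigmoid(logit) = P(answer = "yes")
```

The sketch only captures the generate-then-synthesize pattern the abstract names; the paper's model may well differ in its choice of encoders, attention functions, and how review evidence is aggregated.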

[1] M. Newman. Power laws, Pareto distributions and Zipf's law. 2005.

[2] Julian J. McAuley, et al. Addressing Complex and Subjective Product-Related Queries with Customer Reviews. WWW, 2015.

[3] Sebastian Ruder. An overview of gradient descent optimization algorithms. 2016.

[4] Dzmitry Bahdanau, et al. Neural Machine Translation by Jointly Learning to Align and Translate. ICLR, 2014.

[5] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation. EMNLP, 2014.

[6] Matthew E. Peters, et al. Deep Contextualized Word Representations. NAACL, 2018.

[7] Robert A. Jacobs, et al. Adaptive Mixtures of Local Experts. Neural Computation, 1991.

[8] Chin-Yew Lin. ROUGE: A Package for Automatic Evaluation of Summaries. ACL, 2004.

[9] Karl Moritz Hermann, et al. Teaching Machines to Read and Comprehend. NIPS, 2015.

[10] Yuanhua Lv, et al. Lower-bounding term frequency normalization. CIKM, 2011.

[11] Geoffrey E. Hinton, et al. Reducing the Dimensionality of Data with Neural Networks. Science, 2006.

[12] Mike Schuster, et al. Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 1997.

[13] Yoon Kim. Convolutional Neural Networks for Sentence Classification. EMNLP, 2014.

[14] Sepp Hochreiter, et al. Long Short-Term Memory. Neural Computation, 1997.

[15] Julian J. McAuley, et al. Inferring Networks of Substitutable and Complementary Products. KDD, 2015.

[16] Qian Yu, et al. Review-Aware Answer Prediction for Product-Related Questions Incorporating Aspects. WSDM, 2018.

[17] Miao Fan, et al. Multi-Task Neural Learning Architecture for End-to-End Identification of Helpful Reviews. ASONAM, 2018.

[18] Tomas Mikolov, et al. Distributed Representations of Words and Phrases and their Compositionality. NIPS, 2013.

[19] Xiang Zhang, et al. Character-level Convolutional Networks for Text Classification. NIPS, 2015.

[20] Vinod Nair, et al. Rectified Linear Units Improve Restricted Boltzmann Machines. ICML, 2010.

[21] Yann LeCun, et al. Convolutional networks for images, speech, and time series. 1998.