Context-aware Deep Model for Entity Recommendation in Search Engine at Alibaba

Entity recommendation, which improves the search experience by helping users find entities related to a given query, has become an indispensable feature of today's search engines. Existing studies typically consider only queries that contain explicit entities; they usually fail to handle complex queries without entities, such as "what food is good for cold weather", because their models cannot infer the underlying meaning of the input text. In this work, we argue that contexts convey valuable evidence that can facilitate the semantic modeling of queries, and we take them into account for entity recommendation. To better model the semantics of queries and entities, we learn their representations jointly with attentive deep neural networks. We evaluate our approach using large-scale, real-world search logs from a widely used commercial Chinese search engine. Our system has been deployed in the ShenMa Search Engine and can be accessed through Alibaba's UC Browser. Results from an online A/B test show that the impression efficiency of click-through rate increased by 5.1% and page views increased by 5.5%.
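To make the architecture described above concrete, the following is a minimal sketch (not the authors' implementation) of a context-aware, attention-based model that jointly encodes a query with its surrounding context and scores candidate entities. All module names, dimensions, the mean-pooling step, and the dot-product scoring are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ContextAwareEntityRecommender(nn.Module):
    """Hypothetical sketch: attentive joint encoding of (query, context) and entities."""

    def __init__(self, vocab_size, entity_count, dim=128):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)      # shared token embeddings
        self.entity_emb = nn.Embedding(entity_count, dim)   # candidate entity embeddings
        # Self-attention lets context tokens inform the query representation.
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def encode_query(self, query_ids, context_ids):
        # Concatenate query and context tokens and attend over the full sequence.
        tokens = torch.cat([query_ids, context_ids], dim=1)  # (B, Lq + Lc)
        x = self.token_emb(tokens)                           # (B, L, dim)
        attended, _ = self.attn(x, x, x)                     # (B, L, dim)
        return attended.mean(dim=1)                          # (B, dim) pooled query vector

    def forward(self, query_ids, context_ids, entity_ids):
        q = self.encode_query(query_ids, context_ids)        # (B, dim)
        e = self.entity_emb(entity_ids)                      # (B, K, dim) candidates
        # Relevance score of each candidate entity for the query.
        return torch.einsum('bd,bkd->bk', q, e)              # (B, K) logits

# Toy usage: score 5 candidate entities for each of 2 queries.
model = ContextAwareEntityRecommender(vocab_size=50_000, entity_count=100_000)
query = torch.randint(0, 50_000, (2, 6))       # query token ids
context = torch.randint(0, 50_000, (2, 20))    # context token ids (e.g., session queries)
candidates = torch.randint(0, 100_000, (2, 5)) # candidate entity ids
scores = model(query, context, candidates)     # (2, 5) relevance scores
```

In a setup like this, training with a sampled softmax or cross-entropy loss over the candidate set would be a natural choice, since the entity vocabulary in a production search engine is far too large to score exhaustively.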
