A Transformer-based Embedding Model for Personalized Product Search