COIL: Revisit Exact Lexical Match in Information Retrieval with Contextualized Inverted List
Luyu Gao | Zhuyun Dai | Jamie Callan
[1] W. Bruce Croft,et al. A Deep Relevance Matching Model for Ad-hoc Retrieval , 2016, CIKM.
[2] Danqi Chen,et al. Dense Passage Retrieval for Open-Domain Question Answering , 2020, EMNLP.
[3] Jeffrey Pennington,et al. GloVe: Global Vectors for Word Representation , 2014, EMNLP.
[4] John D. Lafferty,et al. Document Language Models, Query Models, and Risk Minimization for Information Retrieval , 2001, SIGIR Forum.
[5] Zhuyun Dai,et al. Context-Aware Sentence/Passage Term Importance Estimation For First Stage Retrieval , 2019, ArXiv.
[6] Yoon Kim,et al. Convolutional Neural Networks for Sentence Classification , 2014, EMNLP.
[7] Sanjiv Kumar,et al. Accelerating Large-Scale Inference with Anisotropic Vector Quantization , 2019, ICML.
[8] Luyu Gao,et al. Modularized Transformer-based Ranking Framework , 2020, EMNLP.
[9] Jeffrey Dean,et al. Distributed Representations of Words and Phrases and their Compositionality , 2013, NIPS.
[10] Thomas Wolf,et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing , 2019, ArXiv.
[11] Jacob Eisenstein,et al. Sparse, Dense, and Attentional Representations for Text Retrieval , 2021, Transactions of the Association for Computational Linguistics.
[12] Zhiyuan Liu,et al. End-to-End Neural Ad-hoc Ranking with Kernel Pooling , 2017, SIGIR.
[13] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[14] Rodrigo Nogueira,et al. From doc2query to docTTTTTquery , 2019 .
[15] Natalia Gimelshein,et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library , 2019, NeurIPS.
[16] Luyu Gao,et al. Rethink Training of BERT Rerankers in Multi-Stage Retrieval Pipeline , 2021, ECIR.
[17] Benjamin Van Durme,et al. Complement Lexical Retrieval Model with Semantic Residual Embeddings , 2021, ECIR.
[18] W. Bruce Croft,et al. A Markov random field model for term dependencies , 2005, SIGIR '05.
[19] James P. Callan,et al. Context-Aware Document Term Weighting for Ad-Hoc Search , 2020, WWW.
[20] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[21] Jason Weston,et al. Poly-encoders: Architectures and Pre-training Strategies for Fast and Accurate Multi-sentence Scoring , 2020, ICLR.
[22] Raffaele Perego,et al. Efficient Document Re-Ranking for Transformers by Precomputing Term Representations , 2020, SIGIR.
[23] Luke S. Zettlemoyer,et al. Deep Contextualized Word Representations , 2018, NAACL.
[24] Stephen E. Robertson,et al. Some simple effective approximations to the 2-Poisson model for probabilistic weighted retrieval , 1994, SIGIR '94.
[25] Omer Levy,et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach , 2019, ArXiv.
[26] Larry P. Heck,et al. Learning deep structured semantic models for web search using clickthrough data , 2013, CIKM.
[27] Jeff Johnson,et al. Billion-Scale Similarity Search with GPUs , 2017, IEEE Transactions on Big Data.
[28] Ye Li,et al. Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval , 2020, ArXiv.
[29] Nick Craswell,et al. Query Expansion with Locally-Trained Word Embeddings , 2016, ACL.
[30] Jamie Callan,et al. Deeper Text Understanding for IR with Contextual Neural Language Modeling , 2019, SIGIR.
[31] Yiming Yang,et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding , 2019, NeurIPS.
[32] Mandar Mitra,et al. Word Embedding based Generalized Language Model for Information Retrieval , 2015, SIGIR.
[33] Bhaskar Mitra,et al. Overview of the TREC 2019 deep learning track , 2020, ArXiv.
[34] Kyunghyun Cho,et al. Passage Re-ranking with BERT , 2019, ArXiv.
[35] M. Zaharia,et al. ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT , 2020, SIGIR.