Constructing Tree-based Index for Efficient and Effective Dense Retrieval

Recent studies have shown that Dense Retrieval (DR) techniques can significantly improve the performance of first-stage retrieval in IR systems. Despite its empirical effectiveness, the application of DR is still limited. In contrast to statistical retrieval models that rely on highly efficient inverted indexes, DR models produce dense embeddings that are difficult to pre-process with most existing search indexing systems. To avoid the expensive cost of brute-force search, Approximate Nearest Neighbor (ANN) algorithms and the corresponding indexes are widely applied to speed up the inference process of DR models. Unfortunately, while ANN improves the efficiency of DR models, it usually comes at a significant cost in retrieval performance. To address this issue, we propose JTR, which stands for Joint optimization of TRee-based index and query encoding. Specifically, we design a new unified contrastive learning loss to train the tree-based index and the query encoder in an end-to-end manner. A tree-based negative sampling strategy is applied so that the tree satisfies the max-heap property, which in turn supports effective beam search. Moreover, we treat cluster assignment as an optimization problem to update the tree-based index, allowing overlapped clustering. We evaluate JTR on numerous popular retrieval benchmarks. Experimental results show that JTR achieves better retrieval performance while retaining high system efficiency compared with widely adopted baselines. It provides a potential solution for balancing efficiency and effectiveness in neural retrieval system design.
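The retrieval procedure described above can be illustrated with a minimal sketch of beam search over a tree of cluster centroids. This is a hypothetical toy implementation, not JTR's actual code: the `Node` class, the inner-product scoring, and the toy index below are all illustrative assumptions. The comment on the max-heap property reflects the intuition stated in the abstract: when a parent's score upper-bounds its children's scores, a narrow beam is unlikely to prune away the best leaves.

```python
# Hypothetical sketch: beam search over a tree-based index of cluster
# centroids. Structure and scoring are illustrative, not JTR's implementation.
import numpy as np

class Node:
    def __init__(self, centroid, children=None, doc_ids=None):
        self.centroid = np.asarray(centroid, dtype=float)
        self.children = children or []   # non-empty for internal nodes
        self.doc_ids = doc_ids or []     # non-empty only at leaves

def beam_search(root, query, beam_width):
    """Return doc ids from the top-`beam_width` leaves reached by beam search.

    At each level, candidate nodes are scored by inner product with the
    query embedding and only the `beam_width` highest-scoring ones are kept.
    If the tree satisfies the max-heap property (parent scores upper-bound
    child scores), this greedy pruning rarely discards the best leaves.
    """
    query = np.asarray(query, dtype=float)
    frontier = [root]
    while any(node.children for node in frontier):
        candidates = []
        for node in frontier:
            candidates.extend(node.children if node.children else [node])
        candidates.sort(key=lambda n: float(n.centroid @ query), reverse=True)
        frontier = candidates[:beam_width]
    return [doc for leaf in frontier for doc in leaf.doc_ids]

# Toy two-cluster index: documents near (1, 0) vs. documents near (0, 1).
leaf_a = Node([1.0, 0.0], doc_ids=["d1", "d2"])
leaf_b = Node([0.0, 1.0], doc_ids=["d3"])
root = Node([0.5, 0.5], children=[leaf_a, leaf_b])

print(beam_search(root, [0.9, 0.1], beam_width=1))  # → ['d1', 'd2']
```

Because only `beam_width` nodes are expanded per level, the search visits O(beam_width × depth × branching factor) nodes instead of scoring every document, which is the efficiency gain a tree-based index offers over brute-force search.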
