Query and Answer Expansion from Conversation History
Jheng-Hong Yang | Sheng-Chieh Lin | Chuan-Ju Wang | Ming-Feng Tsai | Jimmy Lin