Nan Duan | Ming Gong | Xuguang Wang | Linjun Shou | Daxin Jiang