[1] Yoshua Bengio, et al. HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering, 2018, EMNLP.
[2] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[3] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[4] Guokun Lai, et al. RACE: Large-scale ReAding Comprehension Dataset From Examinations, 2017, EMNLP.
[5] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, arXiv.
[6] Hai Zhao, et al. Dual Multi-head Co-attention for Multi-choice Reading Comprehension, 2020, arXiv.
[7] Dilek Z. Hakkani-Tür, et al. MMM: Multi-stage Multi-task Learning for Multi-choice Reading Comprehension, 2020, AAAI.
[8] Alec Radford, et al. Improving Language Understanding by Generative Pre-Training, 2018.
[9] Kevin Gimpel, et al. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, 2019, ICLR.
[10] Lukasz Kaiser, et al. Attention Is All You Need, 2017, NIPS.
[11] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[12] Claire Cardie, et al. DREAM: A Challenge Data Set and Models for Dialogue-Based Reading Comprehension, 2019, TACL.