Wei Li | Xiaodong Liu | Kevin Duh | Jianfeng Gao | Yuwei Fang | Aerin Kim
[1] Rich Caruana et al. Multitask Learning, 1997, Machine Learning.
[2] Xiaodong Liu et al. Multi-task Learning with Sample Re-weighting for Machine Reading Comprehension, 2018, NAACL.
[3] Ali Farhadi et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[4] Jeffrey Pennington et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[5] Xiaodong Liu et al. An Empirical Analysis of Multiple-Turn Reasoning Strategies in Reading Comprehension Tasks, 2017, IJCNLP.
[6] Luke S. Zettlemoyer et al. Deep Contextualized Word Representations, 2018, NAACL.
[7] Xiaodong Liu et al. Stochastic Answer Networks for Natural Language Inference, 2018, arXiv.
[8] Furu Wei et al. Read + Verify: Machine Reading Comprehension with Unanswerable Questions, 2018, AAAI.
[9] Shuohang Wang et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.
[10] Jian Zhang et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[11] Bowen Zhou et al. A Structured Self-attentive Sentence Embedding, 2017, ICLR.
[12] Xiaodong Liu et al. Stochastic Answer Networks for Machine Reading Comprehension, 2017, ACL.
[13] Lukasz Kaiser et al. Attention is All you Need, 2017, NIPS.
[14] Percy Liang et al. Know What You Don’t Know: Unanswerable Questions for SQuAD, 2018, ACL.
[15] Quoc V. Le et al. QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension, 2018, ICLR.
[16] Xiaodong Liu et al. Representation Learning Using Multi-Task Deep Neural Networks for Semantic Classification and Information Retrieval, 2015, NAACL.
[17] Richard Socher et al. Learned in Translation: Contextualized Word Vectors, 2017, NIPS.
[18] Jimmy Ba et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[19] Xiaodong Liu et al. Multi-Task Learning for Machine Reading Comprehension, 2018, arXiv.