Deng Cai | Xiaofei He | Zhou Zhao | Bin Cao | Boyuan Pan | Hao Li
[1] Lukás Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[2] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method, 2012, ArXiv.
[3] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[4] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[5] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[6] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[7] Jason Weston, et al. End-To-End Memory Networks, 2015, NIPS.
[8] Navdeep Jaitly, et al. Pointer Networks, 2015, NIPS.
[9] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[10] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[11] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[12] Sandro Pezzelle, et al. The LAMBADA dataset: Word prediction requiring a broad discourse context, 2016, ACL.
[13] David A. McAllester, et al. Who did What: A Large-Scale Person-Centered Cloze Dataset, 2016, EMNLP.
[14] Danqi Chen, et al. A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task, 2016, ACL.
[15] Yelong Shen, et al. ReasoNet: Learning to Stop Reading in Machine Comprehension, 2016, CoCo@NIPS.
[16] Zhiguo Wang, et al. Multi-Perspective Context Matching for Machine Comprehension, 2016, ArXiv.
[17] Rudolf Kadlec, et al. Text Understanding with the Attention Sum Reader Network, 2016, ACL.
[18] Ruslan Salakhutdinov, et al. Gated-Attention Readers for Text Comprehension, 2016, ACL.
[19] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[20] Yuxing Peng, et al. Mnemonic Reader for Machine Comprehension, 2017, ArXiv.
[21] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.
[22] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2016, ACL.
[23] Rui Liu, et al. Structural Embedding of Syntactic Trees for Machine Comprehension, 2017, EMNLP.
[24] Minghao Hu, et al. Reinforced Mnemonic Reader for Machine Comprehension, 2017, ArXiv (1705.02798).
[25] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[26] Eunsol Choi, et al. TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension, 2017, ACL.
[27] Shuohang Wang, et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.