[1] Wentao Ma, et al. HFL-RC System at SemEval-2018 Task 11: Hybrid Multi-Aspects Model for Commonsense Reading Comprehension, 2018, ArXiv.
[2] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[3] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[4] Danqi Chen, et al. A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task, 2016, ACL.
[5] Yuxing Peng, et al. Reinforced Mnemonic Reader for Machine Comprehension, 2017.
[6] Xiaodong Liu, et al. Towards Human-level Machine Reading Comprehension: Reasoning and Inference with Multiple Strategies, 2017, ArXiv.
[7] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[8] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[9] Wei Zhao, et al. Yuanfudao at SemEval-2018 Task 11: Three-way Attention and Relational Knowledge for Commonsense Machine Comprehension, 2018, SemEval@NAACL-HLT.
[10] Mitesh M. Khapra, et al. ElimiNet: A Model for Eliminating Options for Reading Comprehension with Multiple Choice Questions, 2018, IJCAI.
[11] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2016, ACL.
[12] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.
[13] Yelong Shen, et al. FusionNet: Fusing via Fully-Aware Attention with Application to Machine Comprehension, 2017, ICLR.
[14] Quoc V. Le, et al. QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension, 2018, ICLR.
[15] Rudolf Kadlec, et al. Text Understanding with the Attention Sum Reader Network, 2016, ACL.
[16] Jürgen Schmidhuber, et al. Framewise phoneme classification with bidirectional LSTM and other neural network architectures, 2005, Neural Networks.
[17] Ruslan Salakhutdinov, et al. Gated-Attention Readers for Text Comprehension, 2016, ACL.
[19] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[20] Jürgen Schmidhuber, et al. Highway Networks, 2015, ArXiv.
[22] Furu Wei, et al. Hierarchical Attention Flow for Multiple-Choice Reading Comprehension, 2018, AAAI.
[23] Yuan Yu, et al. TensorFlow: A system for large-scale machine learning, 2016, OSDI.
[24] Steven Bird, et al. NLTK: The Natural Language Toolkit, 2002, ACL.
[25] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[26] Guokun Lai, et al. RACE: Large-scale ReAding Comprehension Dataset From Examinations, 2017, EMNLP.
[27] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[28] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[29] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[30] Simon Ostermann, et al. SemEval-2018 Task 11: Machine Comprehension Using Commonsense Knowledge, 2018, SemEval@NAACL-HLT.