Hierarchical Attention Flow for Multiple-Choice Reading Comprehension
Haichao Zhu | Furu Wei | Bing Qin | Ting Liu