Deng Cai | Xiaofei He | Zhou Zhao | Bin Cao | Zheqian Chen | Rongqin Yang
[1] James M. Lucas, et al. Exponentially weighted moving average control schemes: Properties and enhancements, 1990.
[2] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method, 2012, ArXiv.
[3] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res..
[4] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[5] Rudolf Kadlec, et al. Text Understanding with the Attention Sum Reader Network, 2016, ACL.
[6] Ye Yuan, et al. Words or Characters? Fine-grained Gating for Reading Comprehension, 2016, ICLR.
[7] Shuohang Wang, et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.
[8] Jianfeng Gao, et al. MS MARCO: A Human Generated MAchine Reading COmprehension Dataset, 2018.
[9] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[10] Navdeep Jaitly, et al. Pointer Networks, 2015, NIPS.
[11] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[12] Danqi Chen, et al. A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task, 2016, ACL.
[13] Deng Cai, et al. MEMEN: Multi-layer Embedding with Memory Networks for Machine Comprehension, 2017, ArXiv.
[15] Yelong Shen, et al. ReasoNet: Learning to Stop Reading in Machine Comprehension, 2016, CoCo@NIPS.
[16] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2016, ACL.
[17] Eunsol Choi, et al. TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension, 2017, ACL.
[18] Peter Clark, et al. Modeling Biological Processes for Reading Comprehension, 2014, EMNLP.
[19] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[20] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[21] Samuel R. Bowman, et al. Ruminating Reader: Reasoning with Gated Multi-hop Attention, 2017, QA@ACL.
[22] Jason Weston, et al. Reading Wikipedia to Answer Open-Domain Questions, 2017, ACL.
[23] Ruslan Salakhutdinov, et al. Gated-Attention Readers for Text Comprehension, 2016, ACL.
[24] Yang Yu, et al. End-to-End Reading Comprehension with Dynamic Answer Chunk Ranking, 2016, ArXiv.
[25] Philip Bachman, et al. Iterative Alternating Neural Attention for Machine Reading, 2016, ArXiv.
[26] Yoon Kim, et al. Convolutional Neural Networks for Sentence Classification, 2014, EMNLP.
[27] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[28] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[29] Xipeng Qiu, et al. Mnemonic Reader: Machine Comprehension with Iterative Aligning and Multi-hop Answer Pointing, 2017.
[30] Gabriella Vigliocco, et al. Lexical surprisal as a general predictor of reading time, 2012, EACL.
[31] Rui Liu, et al. Structural Embedding of Syntactic Trees for Machine Comprehension, 2017, EMNLP.
[32] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[33] Steven Bird, et al. NLTK: The Natural Language Toolkit, 2002, ACL.
[34] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[35] Richard Socher, et al. Ask Me Anything: Dynamic Memory Networks for Natural Language Processing, 2015, ICML.
[36] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.