Adams Wei Yu | David Dohan | Minh-Thang Luong | Rui Zhao | Kai Chen | Mohammad Norouzi | Quoc V. Le
[1] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[2] Martín Abadi, et al. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems, 2016, ArXiv.
[3] David Berthelot, et al. WikiReading: A Novel Large-scale Language Understanding Task over Wikipedia, 2016, ACL.
[4] Christopher Clark, et al. Simple and Effective Multi-Paragraph Reading Comprehension, 2017, ACL.
[5] Jürgen Schmidhuber, et al. Highway Networks, 2015, ArXiv.
[6] Lukasz Kaiser, et al. Depthwise Separable Convolutions for Neural Machine Translation, 2017, ICLR.
[7] Ming Zhou, et al. Neural Question Generation from Text: A Preliminary Study, 2017, NLPCC.
[8] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[9] Yelong Shen, et al. ReasoNet: Learning to Stop Reading in Machine Comprehension, 2016, CoCo@NIPS.
[10] Tao Shen, et al. DiSAN: Directional Self-Attention Network for RNN/CNN-free Language Understanding, 2017, AAAI.
[11] Rui Liu, et al. Structural Embedding of Syntactic Trees for Machine Comprehension, 2017, EMNLP.
[12] Samuel R. Bowman, et al. Ruminating Reader: Reasoning with Gated Multi-hop Attention, 2017, QA@ACL.
[13] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[14] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[15] Yann Dauphin, et al. Convolutional Sequence to Sequence Learning, 2017, ICML.
[16] Zhiguo Wang, et al. Multi-Perspective Context Matching for Machine Comprehension, 2016, ArXiv.
[17] Kenton Lee, et al. Learning Recurrent Span Representations for Extractive Question Answering, 2016, ArXiv.
[18] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.
[19] Mirella Lapata, et al. Paraphrasing Revisited with Neural Machine Translation, 2017, EACL.
[20] Yoshua Bengio, et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling, 2014, ArXiv.
[21] Deng Cai, et al. MEMEN: Multi-layer Embedding with Memory Networks for Machine Comprehension, 2017, ArXiv.
[22] Shuohang Wang, et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.
[23] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[24] Yoon Kim, et al. Convolutional Neural Networks for Sentence Classification, 2014, EMNLP.
[25] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[26] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[27] François Chollet, et al. Xception: Deep Learning with Depthwise Separable Convolutions, 2017, CVPR.
[28] Yang Yu, et al. End-to-End Reading Comprehension with Dynamic Answer Chunk Ranking, 2016, ArXiv.
[29] Xiaodong Liu, et al. Stochastic Answer Networks for Machine Reading Comprehension, 2017, ACL.
[30] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2016, ACL.
[31] Xiang Zhang, et al. Character-level Convolutional Networks for Text Classification, 2015, NIPS.
[32] Eunsol Choi, et al. TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension, 2017, ACL.
[33] Dirk Weissenborn, et al. Making Neural QA as Simple as Possible but not Simpler, 2017, CoNLL.
[34] Rico Sennrich, et al. Improving Neural Machine Translation Models with Monolingual Data, 2015, ACL.
[35] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[36] Mirella Lapata, et al. Learning to Paraphrase for Question Answering, 2017, EMNLP.
[37] John Miller, et al. Globally Normalized Reader, 2017, EMNLP.
[38] George Kurian, et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016, ArXiv.
[39] Kevin Gimpel, et al. Learning Paraphrastic Sentence Embeddings from Back-Translated Bitext, 2017, EMNLP.
[40] Yuxing Peng, et al. Reinforced Mnemonic Reader for Machine Comprehension, 2017.
[41] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[42] Geoffrey E. Hinton, et al. Layer Normalization, 2016, ArXiv.
[43] Quoc V. Le, et al. Learning to Skim Text, 2017, ACL.
[44] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[45] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[46] Kilian Q. Weinberger, et al. Deep Networks with Stochastic Depth, 2016, ECCV.
[47] Li-Rong Dai, et al. Exploring Question Understanding and Adaptation in Neural-Network-Based Question Answering, 2017, ArXiv.
[48] Percy Liang, et al. Adversarial Examples for Evaluating Reading Comprehension Systems, 2017, EMNLP.
[49] Jason Weston, et al. Reading Wikipedia to Answer Open-Domain Questions, 2017, ACL.