Richard Socher | Caiming Xiong | Nitish Shirish Keskar | Victor Zhong
[1] Ankur P. Parikh, et al. Multi-Mention Learning for Reading Comprehension with Neural Cascades, 2017, ICLR.
[2] Eunsol Choi, et al. Coarse-to-Fine Question Answering for Long Documents, 2016, ACL.
[3] Richard Socher, et al. DCN+: Mixed Objective and Deep Residual Coattention for Question Answering, 2017, ICLR.
[4] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[5] Sebastian Riedel, et al. Constructing Datasets for Multi-hop Reading Comprehension Across Documents, 2017, TACL.
[6] Furu Wei, et al. Read + Verify: Machine Reading Comprehension with Unanswerable Questions, 2018, AAAI.
[7] Danqi Chen, et al. A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task, 2016, ACL.
[8] Ani Nenkova, et al. Measuring Importance and Query Relevance in Topic-focused Multi-document Summarization, 2007, ACL.
[9] Claire Cardie, et al. A Sentence Compression Based Framework to Query-Focused Multi-Document Summarization, 2013, ACL.
[10] David Berthelot, et al. WikiReading: A Novel Large-scale Language Understanding Task over Wikipedia, 2016, ACL.
[11] Hoa Trang Dang, et al. Overview of DUC 2006, 2006.
[12] Christopher Clark, et al. Simple and Effective Multi-Paragraph Reading Comprehension, 2017, ACL.
[13] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[14] Jiasen Lu, et al. Hierarchical Question-Image Co-Attention for Visual Question Answering, 2016, NIPS.
[15] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2016, ICLR.
[16] Philip Bachman, et al. NewsQA: A Machine Comprehension Dataset, 2016, Rep4NLP@ACL.
[17] Dan Klein, et al. Constituency Parsing with a Self-Attentive Encoder, 2018, ACL.
[18] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[19] Nicola De Cao, et al. Question Answering by Reasoning Across Documents with Graph Convolutional Networks, 2018, NAACL.
[20] Slav Petrov, et al. Coarse-to-Fine Natural Language Processing, 2011, Theory and Applications of Natural Language Processing.
[21] Chengqi Zhang, et al. Reinforced Self-Attention Network: a Hybrid of Hard and Soft Attention for Sequence Modeling, 2018, IJCAI.
[22] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[23] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[24] Frank Hutter, et al. SGDR: Stochastic Gradient Descent with Warm Restarts, 2016, ICLR.
[25] Richard Socher, et al. A Neural Network for Factoid Question Answering over Paragraphs, 2014, EMNLP.
[26] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[27] Luke S. Zettlemoyer, et al. End-to-end Neural Coreference Resolution, 2017, EMNLP.
[28] Wei Zhang, et al. Evidence Aggregation for Answer Re-Ranking in Open-Domain Question Answering, 2017, ICLR.
[29] Richard Socher, et al. Global-Locally Self-Attentive Encoder for Dialogue State Tracking, 2018, ACL.
[30] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[31] Eunsol Choi, et al. TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension, 2017, ACL.
[32] Yi Yang, et al. WikiQA: A Challenge Dataset for Open-Domain Question Answering, 2015, EMNLP.
[33] Shuohang Wang, et al. Machine Comprehension Using Match-LSTM and Answer Pointer, 2016, ICLR.
[34] Jian Zhang, et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016, EMNLP.
[35] Jason Weston, et al. A Neural Attention Model for Abstractive Sentence Summarization, 2015, EMNLP.
[36] Yue Zhang, et al. Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks, 2018, arXiv.
[37] Mirella Lapata, et al. Coarse-to-Fine Decoding for Neural Semantic Parsing, 2018, ACL.
[38] Richard Socher, et al. Ask Me Anything: Dynamic Memory Networks for Natural Language Processing, 2015, ICML.
[39] Richard Socher, et al. Learned in Translation: Contextualized Word Vectors, 2017, NIPS.
[40] Danqi Chen, et al. Position-aware Attention and Supervised Data Improve Slot Filling, 2017, EMNLP.
[41] Mihai Surdeanu, et al. The Stanford CoreNLP Natural Language Processing Toolkit, 2014, ACL.
[42] Richard Socher, et al. Dynamic Coattention Networks For Question Answering, 2016, ICLR.
[43] Ruslan Salakhutdinov, et al. Neural Models for Reasoning over Multiple Mentions Using Coreference, 2018, NAACL.
[44] SangKeun Lee, et al. Dynamic Self-Attention: Computing Attention over Words Dynamically for Sentence Embedding, 2018, arXiv.
[45] Percy Liang, et al. Know What You Don’t Know: Unanswerable Questions for SQuAD, 2018, ACL.
[46] Yuxing Peng, et al. Reinforced Mnemonic Reader for Machine Comprehension, 2017.
[47] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[48] Jason Weston, et al. End-To-End Memory Networks, 2015, NIPS.
[49] Quoc V. Le, et al. QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension, 2018, ICLR.
[50] Richard Socher, et al. Efficient and Robust Question Answering from Minimal Context over Documents, 2018, ACL.
[51] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[52] Yoshimasa Tsuruoka, et al. A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks, 2016, EMNLP.