[1] Peter Norvig. A Unified Theory of Inference for Text Understanding, 1986.
[2] Qiang Wu, et al. Adapting Boosting for Information Retrieval Measures, 2010, Information Retrieval.
[3] Matthew Richardson, et al. MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text, 2013, EMNLP.
[4] Christopher J. C. Burges. Towards the Machine Comprehension of Text: An Essay, 2013.
[5] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[6] Peter Clark, et al. Modeling Biological Processes for Reading Comprehension, 2014, EMNLP.
[7] Danqi Chen, et al. A Fast and Accurate Dependency Parser using Neural Networks, 2014, EMNLP.
[8] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[9] Jason Weston, et al. End-To-End Memory Networks, 2015, NIPS.
[10] Jason Weston, et al. Memory Networks, 2014, ICLR.
[11] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[12] David A. McAllester, et al. Machine Comprehension with Syntax, Frames, and Semantics, 2015, ACL.
[13] Phil Blunsom, et al. Teaching Machines to Read and Comprehend, 2015, NIPS.
[14] Eric P. Xing, et al. Learning Answer-Entailing Structures for Machine Comprehension, 2015, ACL.
[15] Jason Weston, et al. The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, 2015, ICLR.
[16] Richard Socher, et al. Ask Me Anything: Dynamic Memory Networks for Natural Language Processing, 2015, ICML.
[17] Jianfeng Gao, et al. Reasoning in Vector Space: An Exploratory Study of Question Answering, 2016, ICLR.
[18] Jason Weston, et al. Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks, 2015, ICLR.
[19] Naoaki Okazaki, et al. Dynamic Entity Representation with Max-pooling Improves Machine Reading, 2016, NAACL.
[20] Rudolf Kadlec, et al. Text Understanding with the Attention Sum Reader Network, 2016, ACL.