Employing External Rich Knowledge for Machine Comprehension

The recently proposed machine comprehension (MC) task is an effort to tackle the problem of natural language understanding. However, the small size of labeled machine comprehension data limits the application of deep neural network architectures, which have shown their advantage in semantic inference tasks. Previous methods employ many NLP tools to extract linguistic features, but gain only a small improvement over simple baselines. In this paper, we build an attention-based recurrent neural network model, train it with the help of external knowledge that is semantically relevant to machine comprehension, and achieve a new state-of-the-art result.
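
As a concrete illustration of this kind of architecture, the sketch below shows a minimal attention-based recurrent reader in PyTorch: the passage and question are encoded with bidirectional GRUs, a question-conditioned attention summarizes the passage, and candidate answers are scored against that summary. The class name AttentiveReader, the bilinear attention, and all hyperparameters are illustrative assumptions, not the authors' exact model.

# Minimal sketch of an attention-based recurrent reader (PyTorch).
# Everything here is an assumption for illustration; it follows the
# general recipe in the abstract, not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveReader(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional GRU encoders for the passage and the question.
        self.passage_rnn = nn.GRU(embed_dim, hidden_dim,
                                  batch_first=True, bidirectional=True)
        self.question_rnn = nn.GRU(embed_dim, hidden_dim,
                                   batch_first=True, bidirectional=True)
        # Bilinear attention between the question summary and passage states.
        self.attn = nn.Bilinear(2 * hidden_dim, 2 * hidden_dim, 1)

    def forward(self, passage, question, answers):
        # passage: (B, Tp) token ids; question: (B, Tq) token ids;
        # answers: (B, C, 2H) pre-encoded candidate answer vectors.
        p, _ = self.passage_rnn(self.embed(passage))      # (B, Tp, 2H)
        _, q = self.question_rnn(self.embed(question))    # (2, B, H)
        q = torch.cat([q[0], q[1]], dim=-1)               # (B, 2H)
        # Attention weights over passage tokens, conditioned on the question.
        scores = self.attn(p, q.unsqueeze(1).expand(p.size())).squeeze(-1)
        alpha = F.softmax(scores, dim=-1)                 # (B, Tp)
        # Attention-weighted passage summary.
        summary = torch.bmm(alpha.unsqueeze(1), p).squeeze(1)  # (B, 2H)
        # Dot-product score for each candidate answer.
        return torch.bmm(answers, summary.unsqueeze(-1)).squeeze(-1)  # (B, C)

In the transfer setting the abstract describes, one plausible recipe is to pre-train such an encoder on a large, semantically related corpus (e.g., a natural language inference dataset) and then fine-tune it on the small MC training set; this too is an assumption about the training procedure rather than a statement of the paper's exact method.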
