Incorporating External Knowledge to Boost Machine Comprehension Based Question Answering

We propose a knowledge representation network (KRN) that uses a two-level attention mechanism to represent the background knowledge of entities in documents, with the goal of boosting machine comprehension (MC). In our experiments, we incorporated KRN into several state-of-the-art MC models, including AS Reader, CAS Reader, GA Reader, and BiDAF, and evaluated it on two datasets, WebQA and Quasar-T. The experimental results show that KRN improves the performance of these existing MC models.
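The abstract does not spell out KRN's internals, so the sketch below is only one plausible reading of a "two-level attention" knowledge encoder: the first level attends over the tokens of each retrieved knowledge entry for an entity, and the second level attends over the resulting entry vectors, conditioned on the entity's contextual representation from the host MC model. The module name, tensor shapes, and the fusion step are our assumptions for illustration, not the authors' published code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoLevelKnowledgeAttention(nn.Module):
    """Hypothetical sketch of a two-level attention module for
    representing an entity's background knowledge (KRN-style).

    Level 1 attends over the tokens of each knowledge entry to build
    one vector per entry; level 2 attends over the entries themselves,
    guided by the entity's context vector from the base MC encoder.
    All names and shapes are illustrative assumptions.
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.token_scorer = nn.Linear(hidden_dim, 1)                 # level-1 attention
        self.entry_scorer = nn.Bilinear(hidden_dim, hidden_dim, 1)   # level-2 attention

    def forward(self, entry_tokens: torch.Tensor, entity_ctx: torch.Tensor) -> torch.Tensor:
        # entry_tokens: (num_entries, entry_len, hidden_dim)
        #   encoded tokens of each knowledge entry for one entity
        # entity_ctx:   (hidden_dim,)
        #   the entity's contextual vector from the host MC model

        # Level 1: token-level attention inside each entry.
        tok_scores = self.token_scorer(entry_tokens).squeeze(-1)          # (E, L)
        tok_weights = F.softmax(tok_scores, dim=-1)
        entry_vecs = torch.einsum("el,eld->ed", tok_weights, entry_tokens)  # (E, D)

        # Level 2: entry-level attention conditioned on the entity context.
        ctx = entity_ctx.unsqueeze(0).expand_as(entry_vecs)               # (E, D)
        ent_scores = self.entry_scorer(entry_vecs, ctx).squeeze(-1)       # (E,)
        ent_weights = F.softmax(ent_scores, dim=-1)
        knowledge_vec = ent_weights @ entry_vecs                          # (D,)
        return knowledge_vec

if __name__ == "__main__":
    krn = TwoLevelKnowledgeAttention(hidden_dim=128)
    entries = torch.randn(4, 20, 128)   # 4 knowledge entries, 20 tokens each
    ctx = torch.randn(128)              # entity context from the MC encoder
    print(krn(entries, ctx).shape)      # torch.Size([128])
```

In a host model such as AS Reader or BiDAF, the returned knowledge vector would presumably be fused (e.g., concatenated) with the entity's embedding before the document encoder runs; the exact fusion point is not specified in this excerpt.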

[1] William W. Cohen, et al. Quasar: Datasets for Question Answering by Search and Reading, 2017, arXiv.

[2] Ting Liu, et al. Attention-over-Attention Neural Networks for Reading Comprehension, 2017, ACL.

[3] Qinmin Hu, et al. Enhancing Recurrent Neural Networks with Positional Attention for Question Answering, 2017, SIGIR.

[4] Mihai Surdeanu, et al. The Stanford CoreNLP Natural Language Processing Toolkit, 2014, ACL.

[5] Ali Farhadi, et al. Bidirectional Attention Flow for Machine Comprehension, 2017, ICLR.

[6] Jun Zhao, et al. Collective entity linking in web text: a graph-based method, 2011, SIGIR.

[7] Jason Weston, et al. Reading Wikipedia to Answer Open-Domain Questions, 2017, ACL.

[8] Ruslan Salakhutdinov, et al. Gated-Attention Readers for Text Comprehension, 2017, ACL.

[9] Yuxing Peng, et al. Reinforced Mnemonic Reader for Machine Comprehension, 2017.

[10] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.

[11] Ming Zhou, et al. Reinforced Mnemonic Reader for Machine Reading Comprehension, 2018, IJCAI.

[12] Yang Yu, et al. End-to-End Reading Comprehension with Dynamic Answer Chunk Ranking, 2016, arXiv.

[13] Ting Liu, et al. Consensus Attention-based Neural Networks for Chinese Reading Comprehension, 2016, COLING.

[14] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.

[15] Rudolf Kadlec, et al. Text Understanding with the Attention Sum Reader Network, 2016, ACL.

[16] Philip Bachman, et al. Natural Language Comprehension with the EpiReader, 2016, EMNLP.

[17] Ming Zhou, et al. Gated Self-Matching Networks for Reading Comprehension and Question Answering, 2017, ACL.

[18] Yelong Shen, et al. ReasoNet: Learning to Stop Reading in Machine Comprehension, 2016, CoCo@NIPS.

[19] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2015, ICLR.