Residual Connection-Based Multi-step Reasoning via Commonsense Knowledge for Multiple Choice Machine Reading Comprehension

Generally, the candidate options in multiple-choice machine reading comprehension (MRC) are not explicitly present in the document and must be inferred from the text or even from world knowledge. Previous work has sought to improve performance with the aid of commonsense knowledge or by using a multi-step reasoning strategy. However, no existing model combines multi-step reasoning with external commonsense knowledge for multiple-choice MRC, and two shortcomings remain unsolved: external knowledge may introduce undesirable noise, and only the latest reasoning step contributes to the next one. To address these issues, we propose a multi-step reasoning neural network built on the strong Co-Matching model and aided by commonsense knowledge. First, we present a sentence-level knowledge interaction (SKI) module that integrates commonsense knowledge with the corresponding sentence rather than with the whole MRC instance. Second, we present a residual connection-based multi-step reasoning (RCMR) answer module, in which the next reasoning step depends on the integration of several earlier reasoning steps rather than only the latest one. Comparative experiments on MCScript show that our single model achieves a result comparable to the state-of-the-art single model trained with extra samples, and in particular achieves the best result on commonsense-type questions.
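The abstract describes the RCMR answer module only at a high level, so the following PyTorch sketch is offered purely to illustrate the residual-integration idea. It is a minimal sketch under stated assumptions: the class name `RCMRAnswerModule`, the use of a shared GRU cell as the per-step reasoning unit, and summation as the integration function are all illustrative guesses, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class RCMRAnswerModule(nn.Module):
    """Hypothetical residual connection-based multi-step reasoning module."""

    def __init__(self, hidden_size: int, num_steps: int = 3):
        super().__init__()
        self.num_steps = num_steps
        # A single GRU cell reused across reasoning steps (an assumption;
        # the paper may use a different per-step reasoning unit).
        self.step_cell = nn.GRUCell(hidden_size, hidden_size)
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, memory: torch.Tensor, init_state: torch.Tensor) -> torch.Tensor:
        # memory:     (batch, hidden) pooled co-matching representation of
        #             passage, question, and one candidate option
        # init_state: (batch, hidden) initial reasoning state
        states = [init_state]
        for _ in range(self.num_steps):
            # Residual integration: condition the next step on the sum of
            # ALL earlier reasoning states, not only the latest one.
            integrated = torch.stack(states, dim=0).sum(dim=0)
            states.append(self.step_cell(memory, integrated))
        # Score this candidate option from the final reasoning state;
        # a softmax over options would be applied outside this module.
        return self.scorer(states[-1]).squeeze(-1)
```

The contrast with a conventional multi-step reader lies in the `integrated` line: a standard module would feed only `states[-1]` into the next step, whereas the residual formulation lets earlier reasoning steps continue to inform later ones.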
