Two-layer LSTM with attention for multiple-choice question answering in exams

Question answering in exams is a typical question answering task that aims to test how accurately a model can answer exam questions. In this paper, we apply general deep learning models to the multiple-choice question answering task. Instead of manually extracting features or relying on linguistic tools, our approach builds distributed word embeddings of the questions and answers; to further improve accuracy, an external corpus is introduced. The framework uses a two-layer LSTM with attention, which achieves a significant improvement. For comparison, we also evaluate a simple long short-term memory model (QA-LSTM), a QA-LSTM-CNN model, and a QA-LSTM with attention model as baselines. Experiments demonstrate the superior performance of the two-layer LSTM with attention over these reference models on the question answering task.
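To make the matching architecture concrete, the following is a minimal PyTorch sketch of a two-layer bidirectional LSTM encoder with question-guided attention over the answer, where each candidate answer is scored against the question by cosine similarity and the highest-scoring option is selected. The layer sizes, pooling choice, and scoring function here are illustrative assumptions, not the exact configuration reported in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoLayerLSTMAttention(nn.Module):
    """Shared two-layer bidirectional LSTM encoder with question-guided
    attention over the answer; question-answer pairs are scored by cosine
    similarity. Hyperparameters below are illustrative assumptions."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Two stacked bidirectional LSTM layers shared by question and answer.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=2,
                            batch_first=True, bidirectional=True)
        # Scores each answer hidden state against the pooled question vector.
        self.attn = nn.Linear(4 * hidden_dim, 1)

    def encode(self, tokens):
        # tokens: (batch, seq_len) -> hidden states: (batch, seq_len, 2 * hidden_dim)
        outputs, _ = self.lstm(self.embedding(tokens))
        return outputs

    def forward(self, question, answer):
        q_states = self.encode(question)              # (B, Lq, 2H)
        a_states = self.encode(answer)                # (B, La, 2H)
        q_vec = q_states.max(dim=1).values            # max-pooled question vector: (B, 2H)

        # Attention: weight each answer hidden state by its relevance to the question.
        q_expanded = q_vec.unsqueeze(1).expand(-1, a_states.size(1), -1)
        scores = self.attn(torch.cat([a_states, q_expanded], dim=-1))  # (B, La, 1)
        weights = F.softmax(scores, dim=1)
        a_vec = (weights * a_states).sum(dim=1)       # attended answer vector: (B, 2H)

        # Cosine similarity as the question-answer matching score.
        return F.cosine_similarity(q_vec, a_vec, dim=-1)


if __name__ == "__main__":
    model = TwoLayerLSTMAttention(vocab_size=10000)
    question = torch.randint(1, 10000, (2, 12))       # batch of 2 questions
    answers = torch.randint(1, 10000, (2, 4, 20))     # 4 candidate answers per question
    # Score every candidate; the highest-scoring option is the predicted answer.
    scores = torch.stack([model(question, answers[:, i]) for i in range(4)], dim=1)
    print(scores.argmax(dim=1))
```

In this sketch the baselines mentioned above (QA-LSTM, QA-LSTM-CNN, QA-LSTM with attention) would differ mainly in how the answer representation is pooled and in the number of LSTM layers, while the pairwise cosine-similarity scoring stays the same.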