Relevance-Based Automated Essay Scoring via Hierarchical Recurrent Model

In recent years, neural network models have been applied to the automated essay scoring task and have achieved good performance. However, few studies have investigated incorporating prompt information into the neural network. Essay content is closely related to the essay topic, so the relevance between an essay and its topic can help model the relationship between the essay and its score: a high-scoring essay tends to be more relevant to the topic, while a low-scoring essay is less similar to it. Inspired by this observation, we propose to use the similarity between the essay and the topic as auxiliary information that is concatenated into the final essay representation. We first use a hierarchical recurrent neural network with an attention mechanism to learn content representations of the essay and the topic at the sentence level and the document level. We then multiply the essay representation and the topic representation to obtain a similarity representation between them, and finally concatenate this similarity representation with the essay representation to form the final essay representation. We evaluated our model on the ASAP dataset, and the experimental results show that our model outperforms existing state-of-the-art models.
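To make the described pipeline concrete, the following is a minimal PyTorch sketch of a hierarchical recurrent encoder with attention pooling and an essay-topic similarity term. It is an illustrative reconstruction, not the authors' implementation: the single-layer LSTMs, the attention-pooling form, the hidden sizes, and the reading of the similarity representation as an element-wise product of the essay and topic vectors are all assumptions made for this example.

```python
import torch
import torch.nn as nn


class AttentionPool(nn.Module):
    """Attention pooling: weighted sum of a sequence of hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, h):                          # h: (batch, seq_len, hidden)
        weights = torch.softmax(self.score(h), dim=1)
        return (weights * h).sum(dim=1)            # (batch, hidden)


class RelevanceScorer(nn.Module):
    """Hierarchical LSTM encoder; scores an essay from its own
    representation concatenated with an essay-topic similarity term
    (assumed here to be an element-wise product)."""
    def __init__(self, vocab_size, emb_dim=50, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.sent_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.word_attn = AttentionPool(hidden_dim)
        self.sent_attn = AttentionPool(hidden_dim)
        # final representation = [essay ; essay * topic] -> score in [0, 1]
        self.out = nn.Linear(2 * hidden_dim, 1)

    def encode(self, doc):                         # doc: (batch, n_sents, n_words)
        b, s, w = doc.shape
        words = self.embed(doc.reshape(b * s, w))          # word embeddings
        h, _ = self.word_lstm(words)
        sent_vecs = self.word_attn(h).reshape(b, s, -1)    # sentence vectors
        h, _ = self.sent_lstm(sent_vecs)
        return self.sent_attn(h)                           # document vector

    def forward(self, essay, topic):
        e, t = self.encode(essay), self.encode(topic)
        sim = e * t                                # similarity representation
        return torch.sigmoid(self.out(torch.cat([e, sim], dim=-1)))
```

In practice such a model would be trained by regressing the sigmoid output against scores normalized to [0, 1] (e.g. with a mean squared error loss) and evaluated with quadratic weighted kappa, the standard metric on the ASAP dataset.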
