Marking Essays Automatically

We propose an automatic marking scheme for grading essay-type assessments. Built on top of the BERT model, the scheme first encodes the meaning of the sentences in the students' answers and in the specimen answer, and then uses a classifier to determine how closely the meaning of each student's answer matches that of the specimen answer. Experiments show that the scheme achieves high accuracy when marking assignments on specific topics.
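
The following is a minimal sketch of such a pipeline, assuming a BERT-based sentence encoder from the sentence-transformers library and a logistic-regression classifier over the (u, v, |u - v|) feature scheme popularized by Sentence-BERT; the model name, feature construction, classifier choice, and training pairs are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch only: encoder choice, features, and classifier are assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Step 1: obtain embeddings that capture the meaning of each answer.
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder

def features(student: str, specimen: str) -> np.ndarray:
    u, v = encoder.encode([student, specimen])
    # (u, v, |u - v|) features, as used in Sentence-BERT classification.
    return np.concatenate([u, v, np.abs(u - v)])

# Step 2: train a classifier on graded (student, specimen, label) pairs.
# Label 1 = the answer matches the specimen's meaning, 0 = it does not.
train_pairs = [
    ("The CPU fetches and executes instructions.",
     "A processor retrieves instructions from memory and runs them.", 1),
    ("The CPU stores files on disk.",
     "A processor retrieves instructions from memory and runs them.", 0),
]
X = np.stack([features(s, t) for s, t, _ in train_pairs])
y = np.array([label for _, _, label in train_pairs])
clf = LogisticRegression().fit(X, y)

# Mark a new answer against the specimen answer.
score = clf.predict_proba(
    features("Processors run program instructions.",
             train_pairs[0][1]).reshape(1, -1)
)[0, 1]
print(f"match probability: {score:.2f}")
```

In practice the classifier would be trained on a much larger set of human-graded answers for the topic; the two-pair set above exists only to make the sketch self-contained.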
