Adopting Automated Essay Scoring Feedback In Malaysia: A Review Of The Literature

Assessing essays and providing feedback to learners is a daunting, time-consuming task for language teachers, especially in formative assessment. Formative assessment requires feedback that identifies learning gaps and informs further improvement. Although providing high-quality feedback is important, teachers must often balance tight deadlines and accountability for large classes against the need for iterative, frequent writing practice. The laboriousness of essay marking often limits how frequently essay writing takes place in the classroom. A possible solution is an Automated Essay Scoring (AES) system that can score essays and generate feedback immediately. The purpose of this paper is to review the importance of feedback and the feasibility of adopting an automated mechanism to help Malaysian English teachers mark essays and provide feedback instantly. Features of good feedback, commercially available AES systems, arguments in favour of automated feedback in the Malaysian context, and some limitations of such systems are examined. Our findings suggest that a relevant and effective automated feedback mechanism is possible through a home-grown AES supplemented with properly phrased feedback. Such a mechanism can foster self-regulated learning that empowers learners for sustainable development. Building it is a laborious but attainable task, given the cooperation and support of all parties. Hence, Automated Essay Scoring Feedback (AESF), a home-grown AES, is a feasible and highly anticipated tool for the Malaysian classroom.
