Student Responses to and Perceptions of Feedback Received on a Series of Model-Eliciting Activities: A Case Study

One challenge in implementing open-ended problems is assessing students' responses, because the open-ended nature of the problems allows for numerous suitable, "good" responses. In particular, formative assessment that provides students with feedback on intermediate solutions can be especially challenging when the hope is that students will understand and respond to the feedback in ways that indicate learning has taken place. The aim of this study is to examine how students in a first-year engineering course perceive and respond to feedback received from a Graduate Teaching Assistant (GTA) and from their peers as they iterate through multiple drafts of their solutions to Model-Eliciting Activities (MEAs). In this paper, we report case findings based on three interviews each with four students from a single team, conducted following three MEAs implemented in a single semester. Findings indicated that all four students struggled with the feedback received from their peers. The students agreed that GTA feedback was helpful in improving their MEA solutions and was more useful than the peer feedback. However, the students had contradictory perceptions of the level of specificity and vagueness in the GTA feedback. This study supports the notion that students need training and education both in how to give feedback and in how to respond to feedback.
