Comparative Student Experiences on Electronic Examining in a Programming Course - Case C

Courses in C programming at two Finnish universities were assessed with electronic exams. Two types of electronic exams were used in the study setting: lecture hall exams and exam studio exams. Student experiences were collected through surveys and interviews, and system data was used for exam statistics. The results were compared between exam types and between universities. The results show that students perceive electronic exams as more realistic and natural for programming than traditional pen-and-paper exams, and thus as better support for developing working-life skills. Students in the lecture hall exam described challenges that did not arise in the exam studio exam, while students in the exam studio exam described benefits not available in the lecture hall exam. Based on the study, electronic exams are strongly recommended for programming courses that use exams for summative assessment. In addition, programming environments are recommended for added authenticity with respect to working-life skills, and exam studios are recommended because of the added value they provide compared to lecture hall exams.