Concepts for E-Assessments in STEM on the Example of Engineering Mechanics

We discuss whether and how it is possible to develop meaningful e-assessments in Engineering Mechanics. The focus is on complex example problems resembling traditional paper-pencil exams. Moreover, the switch to e-assessments should be as transparent as possible for the students, i.e., it should not introduce additional difficulties, while still maintaining sufficiently high discrimination indices for all questions. The example problems have been designed in such a way that a great variety of input types can be handled, ranging from graphical to numerical, algebraic, and string inputs. Thanks to the use of randomized variables, it is even possible to create an individual set of initial values for every participant. Additionally, when dealing with complex example problems, errors carried forward have to be taken into account. Different approaches to do so are detailed and discussed, e.g., pre-defined solution paths for sub-questions, reuse of students' previous inputs, or decision trees. The main finding is that complex example problems in Engineering Mechanics can very well be used in e-assessments, provided that the questions are structured into meaningful sub-questions and errors carried forward are accounted for.
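
To make these ideas concrete, the following minimal Python sketch illustrates the two mechanisms described above: per-participant randomization of initial values and follow-through grading of a dependent sub-question, so that an error carried forward from one sub-question is not penalised a second time. The support-reaction task, all function names, and the tolerance are hypothetical illustrations and not the implementation used in the course.

import random

def make_parameters(seed):
    """Draw individual initial values for one participant (hypothetical task:
    a point force F in kN at midspan of a simply supported beam of length L in m)."""
    rng = random.Random(seed)
    return {
        "F": rng.choice([2.0, 2.5, 3.0, 3.5, 4.0]),   # force in kN
        "L": rng.choice([4.0, 5.0, 6.0]),              # beam length in m
    }

def grade_numeric(student_value, reference_value, rel_tol=0.01):
    """Full credit if the numeric answer lies within a relative tolerance."""
    return 1.0 if abs(student_value - reference_value) <= rel_tol * abs(reference_value) else 0.0

def grade_with_follow_through(params, answers):
    """Grade two chained sub-questions:

    (a) support reaction A = F*b/L for a force at distance a = L/2 (so b = L/2),
    (b) bending moment at the load point, M = A*a.

    Sub-question (b) is checked both against the model solution and against the
    value recomputed from the student's own reaction A, so an error in (a) is
    carried forward without costing additional marks in (b).
    """
    a = params["L"] / 2.0
    b = params["L"] - a
    reaction_ref = params["F"] * b / params["L"]
    score_a = grade_numeric(answers["A"], reaction_ref)

    moment_ref = reaction_ref * a            # model solution for (b)
    moment_ft = answers["A"] * a             # follow-through from the student's (a)
    score_b = max(grade_numeric(answers["M"], moment_ref),
                  grade_numeric(answers["M"], moment_ft))
    return {"a": score_a, "b": score_b}

if __name__ == "__main__":
    params = make_parameters(seed=42)                         # one parameter set per participant
    wrong_a = 0.8 * params["F"]                               # a deliberately wrong reaction
    answers = {"A": wrong_a, "M": wrong_a * params["L"] / 2}  # (b) consistent with the wrong (a)
    print(grade_with_follow_through(params, answers))         # {'a': 0.0, 'b': 1.0}

In this sketch the wrong answer to (a) loses its marks once, but the consistent answer to (b) still receives full credit; the same idea generalises to pre-defined solution paths or decision trees over several sub-questions.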
