Automating the Assessment of Problem-solving Practices Using Log Data and Data Mining Techniques

Interactive simulations provide an exciting opportunity to teach and assess the practices that scientists and engineers use to solve real-world problems. This study examines how the logged interaction data from a simulation-based task can be used to automate the assessment of complex problem-solving practices. A total of 73 college students worked on an interactive circuit puzzle embedded in a science simulation in an interview setting. Their problem-solving processes were videotaped and logged in the backend of the simulation. We extracted different sets of features from the log data and evaluated how well they predicted students' problem-solving success and served as evidence of specific problem-solving practices. Our results indicate that applying data mining techniques guided by knowledge gained from qualitative observation was instrumental in discovering semantically meaningful features in the raw log data. These knowledge-grounded features were significant predictors of students' overall problem-solving success and provided evidence of how well students adopted specific problem-solving practices, including decomposition, data collection, and data recording. The results point to promising directions for providing scaffolding and feedback in educational simulations to support students' development of problem-solving skills.
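The paper does not publish its feature set or model, so the sketch below is only an illustration of the general pipeline the abstract describes: turning raw log events into knowledge-grounded features and testing them as predictors of problem-solving success. The event names (isolate_component, take_measurement, record_value), the feature definitions, and the logistic-regression classifier are all assumptions chosen to mirror the practices named above (decomposition, data collection, and data recording), not the study's actual method.

```python
# Illustrative sketch only: event names, features, and labels below are
# hypothetical stand-ins for a simulation's real log vocabulary and outcomes.
from collections import Counter

from sklearn.linear_model import LogisticRegression


def extract_features(events):
    """Turn one student's ordered log events into a feature vector.

    `events` is a list of (action, timestamp) pairs. Each feature is a
    rough behavioral proxy for a problem-solving practice.
    """
    counts = Counter(action for action, _ in events)
    duration = events[-1][1] - events[0][1] if len(events) > 1 else 0.0
    return [
        counts["isolate_component"],  # proxy for decomposition
        counts["take_measurement"],   # proxy for data collection
        counts["record_value"],       # proxy for data recording
        duration,                     # total time on task
    ]


# Synthetic logs for demonstration; a real study would parse backend logs.
logs = [
    [("isolate_component", 0.0), ("take_measurement", 5.2), ("record_value", 6.0)],
    [("take_measurement", 0.0), ("take_measurement", 3.1)],
    [("isolate_component", 0.0), ("take_measurement", 2.0),
     ("take_measurement", 4.0), ("record_value", 5.0)],
    [("take_measurement", 0.0)],
]
success = [1, 0, 1, 0]  # fabricated outcome labels, purely for the example

X = [extract_features(events) for events in logs]
clf = LogisticRegression().fit(X, success)
print(clf.coef_)  # which knowledge-grounded features are associated with success?
```

In a study like the one described, such features would be derived from the simulation's backend logs and grounded in qualitative observation of the videotaped sessions; inspecting the fitted coefficients then suggests which logged behaviors are associated with problem-solving success.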
