In search of learning: facilitating data analysis in educational games

The field of educational games has seen many calls for added rigor. One avenue for improving rigor is developing more generalizable methods for measuring student learning within games. Throughout development, what is relevant to measure and assess may change as a game evolves into a finished product. The field therefore needs an approach that lets game developers and researchers prototype and experiment with different measures, ones that can stand up to rigorous scrutiny while also suggesting new directions for development. We demonstrate a toolkit and accompanying analysis tools that capture and analyze students' performance within open educational games. The system records relevant events during play, which designers can use to analyze player learning. The tools also support replaying student sessions within the original game's environment, allowing researchers and developers to explore possible explanations for student behavior. Using this system, we facilitated several analyses of student learning in an open educational game developed by a team of our collaborators, gained greater insight into how students learn with the game, and identified where to focus future iterations.
