Leveraging Educational Data Mining for Real-time Performance Assessment of Scientific Inquiry Skills within Microworlds

We present Science Assistments, an interactive environment that assesses students' inquiry skills as they conduct inquiry within science microworlds. We frame our variables, tasks, assessments, and methods of analyzing data in terms of evidence-centered design, focusing on the student model, the task model, and the evidence model of the conceptual assessment framework. To support both assessment and the provision of scaffolding, the environment makes inferences about students' inquiry skills using models developed through a combination of text replay tagging [cf. Sao Pedro et al. 2011], a method for rapid manual coding of student log files, and educational data mining. Models were developed for multiple inquiry skills, with particular focus on detecting whether students are testing their articulated hypotheses and whether they are designing controlled experiments. Student-level cross-validation confirmed that this approach can automatically and accurately identify these inquiry skills for new students. The resulting detectors can also be applied at run time to drive scaffolding interventions.
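As a rough illustration of the modeling approach described above, the sketch below shows how a behavior detector might be trained from human-tagged text replay clips and evaluated with student-level cross-validation, so that each detector is tested only on students whose data it never saw during training. This is a minimal sketch under stated assumptions: the synthetic features, the logistic-regression classifier, and the use of scikit-learn's GroupKFold are illustrative choices, not the authors' exact pipeline.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GroupKFold

# Hypothetical data: each row is one text-replay clip, i.e., aggregate features
# computed from a student's log-file actions, with a human-applied tag
# (1 = the clip shows a controlled experiment). All values here are synthetic.
rng = np.random.default_rng(0)
n_clips = 400
X = rng.normal(size=(n_clips, 5))             # e.g., trials run, variables changed per trial, ...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_clips) > 0).astype(int)
students = rng.integers(0, 40, size=n_clips)  # clip -> student id

# Student-level cross-validation: GroupKFold keeps all of a student's clips in
# the same fold, so each fold's detector is evaluated only on unseen students.
aucs = []
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=students):
    detector = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores = detector.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"Mean AUC over held-out students: {np.mean(aucs):.3f}")

At run time, the fitted detector's predicted probability for an incoming clip could serve both as an assessment of the skill and as a trigger for scaffolding, in line with the use of the detectors described above.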

[1] Neil T. Heffernan et al. The ASSISTment Builder: Towards an Analysis of Cost Effectiveness of ITS Creation, 2006, FLAIRS Conference.

[2] Christian D. Schunn et al. The transfer of logically general scientific reasoning skills, 2004.

[3] Cristina Conati et al. Combining Unsupervised and Supervised Classification to Build User Models for Exploratory Learning Environments, 2009, EDM 2009.

[4] Ton de Jong. Computer simulations: Technological advances in inquiry learning, 2006, Science.

[5] Robert S. Cohen et al. On Scientific Discovery, 1981.

[6] Vincent Aleven et al. More Accurate Student Modeling through Contextual Estimation of Slip and Guess Probabilities in Bayesian Knowledge Tracing, 2008, Intelligent Tutoring Systems.

[7] Robert J. Mislevy et al. Automated scoring of complex tasks in computer-based testing, 2006.

[8] Herbert A. Simon et al. Scientific discovery, 1993, BMJ: British Medical Journal.

[9] Marcia C. Linn et al. Impacts of students' experimentation using a dynamic visualization on their understanding of motion, 2008, ICLS.

[10] T. de Jong et al. Exploratory learning with a computer simulation for control theory: learning processes and instructional support, 1993.

[11] David A. Gillam et al. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, 2012.

[12] Wynne Hsu et al. Integrating Classification and Association Rule Mining, 1998, KDD.

[13] M. Chi et al. Eliciting Self-Explanations Improves Understanding, 1994.

[14] Tom Murray et al. Authoring tools for advanced technology learning environments: toward cost-effective adaptive, interactive and intelligent educational software, 2003.

[15] Daniel D. Suthers et al. Component-Based Construction of a Science Learning Space, 1998, Intelligent Tutoring Systems.

[16] Robert J. Mislevy et al. Implications of Evidence-Centered Design for Educational Testing, 2007.

[17] Rakesh Agrawal et al. Fast Algorithms for Mining Association Rules, 1994, VLDB 1994.

[18] C. Hmelo-Silver et al. Scaffolding and Achievement in Problem-Based and Inquiry Learning: A Response to Kirschner, Sweller, and Clark (2006), 2007.

[19] Matthew Bachmann et al. Biology Microworld to Assess Students' Content Knowledge and Inquiry Skills and Leveraging Student Modeling to Prescribe Design Features for Scaffolding Learning, 2012.

[20] D. Klahr et al. All other things being equal: acquisition and transfer of the control of variables strategy, 1999, Child Development.

[21] Robert J. Mislevy et al. Design and Discovery in Educational Assessment: Evidence-Centered Design, Psychometrics, and Educational Data Mining, 2012, EDM 2012.

[22] Ryan Shaun Joazeiro de Baker et al. Developing a generalizable detector of when students game the system, 2008, User Modeling and User-Adapted Interaction.

[23] Ian Witten et al. Data Mining, 2000.

[24] Ryan Shaun Joazeiro de Baker et al. Leveraging machine-learned detectors of systematic inquiry behavior to estimate and predict transfer of inquiry skill, 2011, User Modeling and User-Adapted Interaction.

[25] L. Baum et al. Statistical Inference for Probabilistic Functions of Finite State Markov Chains, 1966.

[26] Jacob Cohen. A Coefficient of Agreement for Nominal Scales, 1960.

[27] John R. Anderson et al. The generality/specificity of expertise in scientific reasoning, 1999, Cogn. Sci.

[28] P. Reimann. Detecting functional relations in a computerized discovery environment, 1991.

[29] Helen R. Quinn et al. A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, 2013.

[30] Neil T. Heffernan et al. Detection and Analysis of Off-Task Gaming Behavior in Intelligent Tutoring Systems, 2006, Intelligent Tutoring Systems.

[31] Sebastián Ventura et al. Educational Data Mining: A Review of the State of the Art, 2010, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).

[32] Robert J. Mislevy et al. Putting ECD into Practice: The Interplay of Theory and Data in Evidence Models within a Digital Learning Environment, 2012, EDM 2012.

[33] Massachusetts Department of Education. Massachusetts Science and Technology/Engineering Curriculum Framework, 2006.

[34] D. Perkins. Knowledge as Design, 1986.

[35] Barbara C. Buckley et al. Using Log Files to Track Students' Model-based Inquiry in Science, 2006, ICLS.

[36] Ryan Shaun Joazeiro de Baker et al. Labeling Student Behavior Faster and More Precisely with Text Replays, 2008, EDM.

[37] Neil T. Heffernan et al. Addressing the assessment challenge with an online system that tutors as it assesses, 2009, User Modeling and User-Adapted Interaction.

[38] J. Frederiksen et al. Inquiry, Modeling, and Metacognition: Making Science Accessible to All Students, 1998.

[39] Cristina Conati et al. Discovering and Recognizing Student Interaction Patterns in Exploratory Learning Environments, 2010, Intelligent Tutoring Systems.

[40] Joseph E. Beck et al. Tracking Students' Inquiry Paths through Student Transition Analysis, 2010, EDM.

[41] J. E. Tschirgi. Sensible reasoning: A hypothesis about hypotheses, 1980.

[42] Jennifer A. Fredricks et al. Inquiry in Project-Based Science Classrooms: Initial Attempts by Middle School Students, 1998.

[43] William F. Brewer et al. The Role of Anomalous Data in Knowledge Acquisition: A Theoretical Framework and Implications for Science Instruction, 1993.

[44] J. Hanley et al. The meaning and use of the area under a receiver operating characteristic (ROC) curve, 1982, Radiology.

[45] Gail P. Baxter et al. Science performance assessments: benchmarks and surrogates, 1994.

[46] John R. Anderson et al. Knowledge tracing: Modeling the acquisition of procedural knowledge, 2005, User Modeling and User-Adapted Interaction.

[47] D. Kuhn. Education for Thinking, 1986, Teachers College Record.

[48] J. Kaczorowski. On the structure, 1999.

[49] Michael J. Timms et al. Assessment of Student Learning in Science Simulations and Games, 2009.

[50] Richard E. Clark et al. Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching, 2006.

[51] Ton de Jong et al. Supporting hypothesis generation by learners exploring an interactive computer simulation, 1991.

[52] Ryan Shaun Joazeiro de Baker et al. Using Text Replay Tagging to Produce Detectors of Systematic Experimentation Behavior Patterns, 2010, EDM.

[53] Valerie J. Shute et al. A Large-Scale Evaluation of an Intelligent Discovery World: Smithtown, 1990, Interact. Learn. Environ.

[54] Amelia Wenk Gotwals et al. Measuring Students' Scientific Content and Inquiry Reasoning, 2006, ICLS.

[55] Allen Newell et al. Human Problem Solving, 1973.

[56] Joel D. Martin et al. Student assessment using Bayesian nets, 1995, Int. J. Hum. Comput. Stud.

[57] Maria Araceli Ruiz-Primo et al. Note on Sources of Sampling Variability in Science Performance Assessments, 1999.

[58] Joseph Krajcik et al. Middle school students' use of appropriate and inappropriate evidence in writing scientific explanations, 2012.

[59] J. Meditch et al. Applied optimal control, 1972, IEEE Transactions on Automatic Control.

[60] Jonathan P. Rowe et al. Modeling User Knowledge with Dynamic Bayesian Networks in Interactive Narrative Environments, 2010, AIIDE.

[61] David Klahr et al. Dual Space Search During Scientific Reasoning, 1988, Cogn. Sci.

[62] A. Newell. Unified Theories of Cognition, 1990.

[63] R. Hambleton et al. An NCME Instructional Module on Comparison of Classical Test Theory and Item Response Theory and Their Applications to Test Development, 2005.

[64] Dragan Gamberger et al. Combining Unsupervised and Supervised Machine Learning, 2001, AIME.

[65] L. Schauble et al. Cross-Domain Development of Scientific Reasoning, 1992.

[66] Ryan Shaun Joazeiro de Baker et al. Identifying Students' Inquiry Planning Using Machine Learning, 2010, EDM.

[67] Ton de Jong et al. Technological Advances in Inquiry Learning, 2006.

[68] James W. Pellegrino et al. Knowing What Students Know, 2003.

[69] W. R. van Joolingen et al. SimQuest: authoring educational simulations, 2003.

[70] Roy D. Pea et al. On the Cognitive Effects of Learning Computer Programming: A Critical Look. Technical Report No. 9, 1987.

[71] R. Charles Murray et al. Reducing the Knowledge Tracing Space, 2009, EDM.

[72] Leona Schauble et al. Students' Understanding of the Objectives and Procedures of Experimentation in the Science Classroom, 1995.

[73] Robert J. Mislevy et al. Evidence-Centered Design of Epistemic Games: Measurement Principles for Complex Learning Environments, 2010.

[74] S. Messick. The Interplay of Evidence and Consequences in the Validation of Performance Assessments, 1994.

[75] Jim Reye. Student Modelling Based on Belief Networks, 2004.

[76] Marcia C. Linn et al. Helping students make controlled experiments more informative, 2010, ICLS.

[77] Arin Ghazarian et al. Automatic detection of users' skill levels using high-frequency user interface events, 2010, User Modeling and User-Adapted Interaction.

[78] R. Glaser et al. Knowing What Students Know: The Science and Design of Educational Assessment, 2001.

[79] L. Crocker et al. Introduction to Classical and Modern Test Theory, 1986.

[80] Ton de Jong et al. An extended dual search space model of scientific discovery learning, 1997.

[81] Barbara Wasson et al. Learning by creating and exchanging objects: The SCY experience, 2010, Br. J. Educ. Technol.

[82] C. Lebiere et al. The Atomic Components of Thought, 1998.

[83] Ian H. Witten et al. Data mining: practical machine learning tools and techniques, 3rd Edition, 1999.

[84] Michelene T. H. Chi et al. Eliciting Self-Explanations Improves Understanding, 1994, Cogn. Sci.

[85] R. Shavelson et al. Rhetoric and reality in science performance assessments: An update, 1996.

[86] D. Kuhn. The Skills of Argument, 2008, Education for Thinking.

[87] Peter Norvig et al. Artificial Intelligence: A Modern Approach, 1995.

[88] J. Shea. National Science Education Standards, 1995.

[89] Paul Brna et al. Extending the scope of the student model, 1995, User Modeling and User-Adapted Interaction.

[90] R. J. de Ayala. The Theory and Practice of Item Response Theory, 2008.

[91] Zachary A. Pardos et al. Ensembling predictions of student knowledge within intelligent tutoring systems, 2011, UMAP'11.

[92] Cecilia L. Lopez et al. Assessment of Student Learning, 1998.

[93] Jim Reye. Student Modelling Based on Belief Networks, 2004, Int. J. Artif. Intell. Educ.

[94] Ryan Shaun Joazeiro de Baker et al. Contextual Slip and Prediction of Student Performance after Use of an Intelligent Tutor, 2010, UMAP.

[95] Arnon Hershkovitz et al. Carelessness and Goal Orientation in a Science Microworld, 2011, AIED.

[96] Dimitrios Pierrakos et al. User Modeling and User-Adapted Interaction, 1994, User Modeling and User-Adapted Interaction.

[97] Matthew W. Lewis et al. Self-Explanations: How Students Study and Use Examples in Learning to Solve Problems, 1989, Cogn. Sci.

[98] Zachary A. Pardos et al. Using Fine-Grained Skill Models to Fit Student Performance with Bayesian Networks, 2006.

[99] Ryan S. Baker et al. The State of Educational Data Mining in 2009: A Review and Future Visions, 2009, EDM 2009.

[100] Jos Beishuizen et al. Determinants of discovery learning in a complex simulation learning environment, 2005.

[101] A. Corbett et al. The Cambridge Handbook of the Learning Sciences: Cognitive Tutors, 2005.

[102] Moffat Mathews et al. Detecting Gaming the System in Constraint-Based Tutors, 2010, UMAP.

[103] Paul Black et al. Testing: Friend or Foe? Theory and Practice of Assessment and Testing, 1997.

[104] Tanja Mitrovic et al. Constraint-based tutors: a success story, 2001, AIED.

[105] Ronald H. Stevens et al. Modeling the Development of Problem Solving Skills in Chemistry with a Web-Based Tutor, 2004, Intelligent Tutoring Systems.

[106] Paul Horwitz et al. Looking inside the black box: assessing model-based learning and inquiry in BioLogica™, 2010, Int. J. Learn. Technol.

[107] Kenneth R. Koedinger et al. Using Model-Tracing to Conduct Performance Assessment of Students' Inquiry Skills within a Microworld, 2011.

[108] Leona Schauble et al. Scientific Reasoning Across Different Domains, 1992.

[109] Gary S. Kahn et al. Strategies for Knowledge Acquisition, 1985, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[110] Russell G. Almond et al. On the Structure of Educational Assessments, CSE Technical Report, 2003.