Combining Machine Learning and Qualitative Methods to Elaborate Students’ Ideas About the Generality of Their Model-Based Explanations

Assessing students’ participation in science practices presents several challenges, especially when aiming to differentiate meaningful (vs. rote) forms of participation. In this study, we sought to use machine learning (ML) for a novel purpose in science assessment: developing a construct map for students’ consideration of generality, a key epistemic understanding that undergirds meaningful participation in knowledge-building practices. We report on our efforts to assess the nature of 845 students’ ideas about the generality of their model-based explanations by pairing an embedded written assessment with a novel analytic approach that combines unsupervised and supervised ML methods with human-driven, interpretive coding. We demonstrate how unsupervised ML methods, coupled with qualitative, interpretive coding, allowed us to revise our construct map for generality in a way that supported a more nuanced evaluation closely tied to empirical patterns in the data. We also explored using the revised construct map as a coding framework within supervised ML methods, finding that it shows some viability for use in future analyses. We discuss implications for assessing students’ meaningful participation in science practices in terms of their considerations of generality, for the role of unsupervised methods in science assessment, and for combining ML and human-driven approaches to understand students’ complex involvement in science practices.
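To make the two-stage approach concrete, the sketch below illustrates how such a pipeline might be wired together. It is a minimal sketch under stated assumptions, not the study’s actual analysis: it assumes scikit-learn as the toolkit, TF-IDF features, k-means clustering, and a random-forest classifier, none of which are confirmed by the abstract, and the example responses and construct-map codes are hypothetical placeholders.

```python
# Minimal sketch of a two-stage ML + interpretive-coding pipeline (assumed
# scikit-learn implementation; responses and codes below are hypothetical).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score

# Hypothetical short written responses about how generally a model-based
# explanation applies.
responses = [
    "My model only explains the lake we studied.",
    "The model works for any body of water because the mechanism is the same.",
    "It explains what happened in our experiment.",
    "Particles behave the same everywhere, so the explanation applies to other cases.",
    "This diagram shows our pond and nothing else.",
    "The same process would happen anywhere water is heated.",
]

# Stage 1 (unsupervised): vectorize the responses and cluster them. The
# clusters are then read interpretively by researchers to revise the
# construct map.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(responses)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster assignments:", clusters)

# Stage 2 (supervised): once responses carry human-assigned construct-map
# levels (hypothetical 0/1 codes here), train a classifier and check its
# agreement with the human codes using Cohen's kappa.
human_codes = [0, 1, 0, 1, 0, 1]
X_train, X_test, y_train, y_test = train_test_split(
    X, human_codes, test_size=0.5, random_state=0, stratify=human_codes
)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("Cohen's kappa vs. human codes:", cohen_kappa_score(y_test, clf.predict(X_test)))
```

In the approach the abstract describes, the interpretive step sits between the two stages: researchers read the clusters, revise the construct map against empirical patterns in the responses, and only then produce the human codes that supervise the classifier.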
