Understanding students' performance in a computer-based assessment of complex problem solving: An analysis of behavioral data from computer-generated log files

Computer-based assessments of complex problem solving (CPS) that have been used in international large-scale surveys require students to engage in an in-depth interaction with the problem environment. In doing so, they evoke manifest sequences of overt behavior that are stored in computer-generated log files. In the present study, we explored the relation between several overt behaviors, which N = 1476 Finnish ninth-grade students (mean age = 15.23 years, SD = 0.47) exhibited when exploring a CPS environment, and their CPS performance. We used the MicroDYN approach to measure CPS and inspected students' behaviors through log-file analyses. Results indicated that students who occasionally observed the problem environment in a noninterfering way, in addition to actively exploring it (noninterfering observation), showed better CPS performance, whereas students who showed a high frequency of (potentially unplanned) interventions (intervention frequency) exhibited worse CPS performance. Additionally, both too much and too little time spent on a CPS task (time on task) were associated with poor CPS performance. The observed effects held after controlling for students' use of an exploration strategy that requires a sequence of multiple interventions (the VOTAT strategy), indicating that these behaviors had incremental effects on CPS performance beyond the use of VOTAT.

Highlights:
- Complex problem solving (CPS) tasks require students to engage in in-depth interaction with the environment.
- We investigated how behavioral indicators captured in computer-generated log files related to CPS performance.
- The relation between time on task and CPS performance followed an inverted u-shape.
- Noninterfering observation was positively, and high intervention frequency negatively, related to CPS performance.
- Relations decreased in size but remained stable after controlling for a multistep exploration strategy (VOTAT).
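To make concrete what deriving such behavioral indicators from log files can look like, the following is a minimal Python sketch. It assumes a hypothetical event format (an "apply" click with the current input settings) and simplified indicator definitions; it is not the authors' actual MicroDYN logging scheme or scoring procedure.

```python
# Illustrative sketch only: hypothetical log-event format and indicator
# definitions, not the authors' actual MicroDYN data format or scoring.
from dataclasses import dataclass


@dataclass
class LogEvent:
    time: float       # seconds since task onset (assumed unit)
    action: str       # e.g. "apply" when the student advances the system
    settings: tuple   # input-variable settings at that step (0 = neutral)


def indicators(events: list[LogEvent], task_end: float) -> dict:
    """Derive simple behavioral indicators from one task's event stream."""
    applies = [e for e in events if e.action == "apply"]

    # Time on task: total time from task onset to final submission.
    time_on_task = task_end

    # Intervention frequency: number of active interventions (apply clicks).
    intervention_frequency = len(applies)

    # Noninterfering observation: steps where the system was advanced while
    # all inputs were left at their neutral setting.
    noninterfering = sum(1 for e in applies if all(v == 0 for v in e.settings))

    # Crude VOTAT check: was each input variable, at least once, the only
    # nonzero input in an intervention ("vary one thing at a time")?
    n_vars = len(applies[0].settings) if applies else 0
    votat = all(
        any(
            e.settings[i] != 0
            and all(v == 0 for j, v in enumerate(e.settings) if j != i)
            for e in applies
        )
        for i in range(n_vars)
    )

    return {
        "time_on_task": time_on_task,
        "intervention_frequency": intervention_frequency,
        "noninterfering_observation": noninterfering,
        "votat_used": votat,
    }


# Example: three interventions on a task with two input variables.
events = [
    LogEvent(4.2, "apply", (1, 0)),   # vary only variable A
    LogEvent(9.8, "apply", (0, 2)),   # vary only variable B
    LogEvent(15.1, "apply", (0, 0)),  # noninterfering observation step
]
print(indicators(events, task_end=31.5))
```

In this toy stream, the student isolates each input once (so the crude VOTAT flag is set), performs one noninterfering observation step, and makes three interventions in total; real scoring would additionally need to handle resets, multiple exploration rounds, and the task's dynamic outputs.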
