The Predictive Power of Eye-Tracking Data in an Interactive AR Learning Environment

Learning through embodiment is a promising concept, potentially capable of removing many of the layers of abstraction that hinder the learning process. Walk the Graph, our HoloLens 2-based AR application, provides an inquiry-based learning setting for understanding graphs through the user's full-body movement. In this paper, as part of our ongoing work to build an AI framework that quantifies and predicts a user's learning gain, we examine the predictive potential of gaze data collected during app usage. To classify users into groups with different learning gains, we construct a map of areas of interest (AOIs) derived from the gaze data itself. Subsequently, using a sliding-window approach, we extract engineered features from both the in-app and the gaze data. Our experimental results show that a Support Vector Machine with selected features achieves the highest F1 score (0.658; baseline: 0.251) compared to other approaches, including a k-Nearest Neighbor and a Random Forest classifier, although in each case the lion's share of the predictive power is provided by the gaze-based features.
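
As a minimal illustration of the pipeline sketched above (not the authors' actual implementation), the following Python snippet shows how sliding-window features could be aggregated from synchronized gaze and in-app logs and how the three classifiers might be compared by F1 score. All column names, window lengths, feature choices, and hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch: sliding-window feature extraction over synchronized
# gaze + in-app logs, followed by the classifier comparison described above.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def window_features(session: pd.DataFrame, win_s: float = 10.0, step_s: float = 5.0) -> pd.DataFrame:
    """Slide a fixed-length window over one session and aggregate simple
    gaze- and app-based features per window (illustrative choices only)."""
    rows = []
    t0, t_end = session["t"].min(), session["t"].max()
    start = t0
    while start + win_s <= t_end:
        w = session[(session["t"] >= start) & (session["t"] < start + win_s)]
        if len(w):
            rows.append({
                "fixation_rate": w["is_fixation"].mean(),              # share of fixation samples
                "aoi_switches": w["aoi"].ne(w["aoi"].shift()).sum() - 1,  # number of AOI transitions
                "dwell_graph": (w["aoi"] == "graph").mean(),           # relative dwell time on the graph AOI
                "speed_mean": w["walk_speed"].mean(),                  # in-app full-body movement feature
            })
        start += step_s
    return pd.DataFrame(rows)


def compare_classifiers(X: np.ndarray, y: np.ndarray) -> dict:
    """Cross-validated macro F1 for the three model families mentioned above."""
    models = {
        "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
        "kNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
        "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    return {name: cross_val_score(m, X, y, scoring="f1_macro", cv=5).mean()
            for name, m in models.items()}
```

In practice, the windowed feature matrix would be paired with per-user learning-gain labels (e.g. from pre/post tests) before calling `compare_classifiers`; feature selection on top of these features is omitted here for brevity.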
