A gaze-based learning analytics model: in-video visual feedback to improve learner's attention in MOOCs

In the context of MOOCs, "With-me-ness" refers to the extent to which the learner succeeds in following the teacher, specifically in terms of looking at the area in the video that the teacher is explaining. In our previous work, we employed eye-tracking methods to quantify learners' With-me-ness and showed that it is positively correlated with their learning gains. In this contribution, we describe a tool designed to improve With-me-ness by providing a visual aid superimposed on the video. The position of the visual aid is suggested by the teacher's dialogue and deixis, and it is displayed when the learner's With-me-ness falls below the average value computed from the other students' gaze behavior. We report on a user study that examines the effectiveness of the proposed tool. The results show that the tool significantly improves learning gains and significantly increases the extent to which students follow the teacher. Finally, we demonstrate how With-me-ness can create a complete theoretical framework for conducting gaze-based learning analytics in the context of MOOCs.
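The triggering mechanism described above can be sketched in code. This is a minimal, hypothetical illustration, not the authors' implementation: the With-me-ness proxy (fraction of gaze samples near the teacher-referenced area) and all names below are assumptions introduced for clarity.

```python
# Hypothetical sketch of the feedback-triggering logic: a visual aid is
# shown when the learner's With-me-ness drops below the average computed
# from other students' gaze behavior. The With-me-ness proxy used here
# (fraction of gaze samples near the referenced area) is an assumption.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class GazeSample:
    t: float  # video timestamp (seconds)
    x: float  # gaze x-coordinate on the video frame (pixels)
    y: float  # gaze y-coordinate (pixels)


def with_me_ness(samples: List[GazeSample],
                 target: Tuple[float, float],
                 radius: float) -> float:
    """Fraction of gaze samples falling within `radius` of the area the
    teacher is currently referring to (a simplified proxy measure)."""
    if not samples:
        return 0.0
    tx, ty = target
    hits = sum(1 for s in samples
               if (s.x - tx) ** 2 + (s.y - ty) ** 2 <= radius ** 2)
    return hits / len(samples)


def should_show_visual_aid(learner_score: float,
                           peer_scores: List[float]) -> bool:
    """Trigger the in-video visual aid when the learner's With-me-ness
    is below the peer average."""
    avg = sum(peer_scores) / len(peer_scores)
    return learner_score < avg
```

In a deployed system the target area would be derived from the teacher's dialogue and deixis (as the abstract notes), and the peer average would be precomputed per video segment.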
