Pupillometry and Head Distance to the Screen to Predict Skill Acquisition During Information Visualization Tasks

In this paper we investigate using a variety of behavioral measures, collectible with an eye tracker, to predict a user's skill-acquisition phase while performing information visualization tasks with bar graphs. Our long-term goal is to use this information in real time to create user-adaptive visualizations that provide personalized support for visualization processing based on the user's predicted skill level. We show that leveraging two additional content-independent data sources, namely a user's pupil dilation and head distance to the screen, significantly improves the accuracy of skill-acquisition predictions compared to predictions made using only content-dependent information on eye-gaze attention patterns, as was done in previous work. A model that combines features from pupil dilation, head distance, and gaze outperforms both the baseline and a model using content-dependent gaze information alone.
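The approach described above can be sketched as a standard supervised-learning setup: concatenate content-dependent gaze features with content-independent pupil and head-distance features, and compare cross-validated accuracy against a gaze-only feature set. The sketch below is illustrative only, not the authors' pipeline; the feature names, the synthetic data, and the choice of a random-forest classifier are all assumptions.

```python
# Minimal sketch (assumed setup, not the paper's actual pipeline):
# predict a binary skill-acquisition phase from combined eye-tracking features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # synthetic "users x time-windows" samples

# Content-dependent gaze features (e.g., fixation counts/durations on AOIs).
gaze = rng.normal(size=(n, 4))
# Content-independent features: pupil-dilation and head-distance summaries.
pupil = rng.normal(size=(n, 2))
head = rng.normal(size=(n, 2))

# Synthetic labels correlated with pupil/head signals, purely for illustration.
y = (pupil[:, 0] + head[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

X_gaze_only = gaze
X_combined = np.hstack([gaze, pupil, head])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
acc_gaze = cross_val_score(clf, X_gaze_only, y, cv=5).mean()
acc_combined = cross_val_score(clf, X_combined, y, cv=5).mean()
print(f"gaze-only accuracy: {acc_gaze:.2f}")
print(f"combined accuracy:  {acc_combined:.2f}")
```

Because the synthetic labels here depend only on the pupil/head columns, the combined feature set will beat the gaze-only one by construction; in the paper, that comparison is the empirical question being tested on real eye-tracker data.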
