User-adaptive information visualization: using eye gaze data to infer visualization tasks and user cognitive abilities

Information visualization systems have traditionally followed a one-size-fits-all model, typically ignoring an individual user's needs, abilities, and preferences. However, recent research has indicated that visualization performance could be improved by adapting aspects of the visualization to each individual user. To this end, this paper presents research aimed at supporting the design of novel user-adaptive visualization systems. In particular, we discuss results on using a user's eye gaze patterns while interacting with a given visualization to predict that user's visualization tasks, as well as cognitive abilities including perceptual speed, visual working memory, and verbal working memory. We show that such predictions are significantly better than a baseline classifier even during the early stages of visualization usage. These findings are discussed in view of designing visualization systems that can adapt to each individual user in real time.
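To make the general approach concrete, the sketch below shows one way a gaze-based prediction pipeline could be set up: aggregate gaze measures are computed per trial, a classifier is trained to predict the visualization task, and its accuracy is compared against a majority-class baseline. This is an illustrative sketch only, not the paper's actual pipeline; the feature names, synthetic data, and choice of scikit-learn are assumptions made for the example.

```python
# Illustrative sketch (not the authors' pipeline): classify a user's visualization
# task from aggregate gaze features and compare against a majority-class baseline.
# The features, data, and model choice here are assumptions for demonstration.
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-trial gaze summary features: mean fixation duration (ms),
# fixation rate (per s), mean saccade length (px), proportion of gaze on the legend.
n_trials = 200
X = np.column_stack([
    rng.normal(250, 40, n_trials),   # mean fixation duration
    rng.normal(3.0, 0.5, n_trials),  # fixation rate
    rng.normal(120, 25, n_trials),   # mean saccade length
    rng.uniform(0, 0.4, n_trials),   # proportion of time on the legend region
])
y = rng.integers(0, 2, n_trials)     # task label, e.g. "retrieve value" vs. "compare"

# Majority-class baseline vs. a classifier trained on the gaze features.
baseline = cross_val_score(DummyClassifier(strategy="most_frequent"), X, y, cv=10)
model = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=10)

print(f"baseline accuracy:           {baseline.mean():.2f}")
print(f"gaze-feature model accuracy: {model.mean():.2f}")
```

To mimic "early" prediction as described above, the same features could be recomputed over only the first portion of each trial's gaze data (e.g., the first few seconds) before training, so that accuracy can be tracked as a function of how much of the interaction has been observed.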
