Linking search tasks with low-level eye movement patterns

Motivation -- On-task detection of the task type and its attributes can benefit the personalization and adaptation of information systems.

Research approach -- A web-based information search experiment was conducted with 32 participants using a multi-stream logging system. The realistic tasks were directly related to the participants' backgrounds and were of distinct task types.

Findings/Design -- We report a relationship between the task and individual reading behaviour. Specifically, we show that transitions between scanning and reading behaviour in eye movement patterns are an implicit indicator of the current task.

Research limitations/Implications -- This work suggests it is plausible to infer the type of information task from eye movement patterns. One limitation is a lack of knowledge about how general reading models differ across task types in the population. Although this is an experimental study, we argue that it can be generalized to real-world, text-oriented information search tasks.

Originality/Value -- This research presents a new methodology for modelling user behaviour in information search tasks. It suggests promise for detecting the information task type based on patterns of eye movements.

Take away message -- With increasingly complex computer interaction, knowledge about the type of information task can be valuable for system personalization. Modelling the reading/scanning patterns of eye movements can allow inference about the task type and task attributes.
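The core idea, classifying fixations as reading or scanning and using transitions between the two modes as a task-type feature, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the `Fixation` fields, the threshold values, and the labelling rule are all illustrative assumptions.

```python
# Hypothetical sketch: label fixations as "reading" vs "scanning" and count
# mode transitions as one candidate feature for task-type inference.
# Thresholds below are illustrative placeholders, not values from the study.
from dataclasses import dataclass
from typing import List

@dataclass
class Fixation:
    duration_ms: float       # fixation duration in milliseconds
    saccade_len_deg: float   # amplitude of the incoming saccade, in degrees

def label_fixation(f: Fixation) -> str:
    # Reading is typically characterized by short saccades and moderate
    # fixation durations; long saccades or very brief fixations suggest
    # scanning (a simplified rule for illustration only).
    if f.saccade_len_deg <= 3.0 and f.duration_ms >= 150.0:
        return "reading"
    return "scanning"

def transition_count(fixations: List[Fixation]) -> int:
    # Count switches between reading and scanning over the fixation sequence.
    labels = [label_fixation(f) for f in fixations]
    return sum(1 for a, b in zip(labels, labels[1:]) if a != b)
```

In practice such a transition count would be one feature among others (e.g. dwell times, fixation counts) fed to a classifier that predicts the task type.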
