Can eyes reveal interest? Implicit queries from gaze patterns

We study a new research problem in which an implicit information retrieval query is inferred from eye movements measured while the user is reading, and is then used to retrieve new documents. In the training phase, the user's interest is known, and we learn a mapping from how the user looks at a term to the role of the term in the implicit query. Assuming the mapping is universal, that is, the same for all queries in a given domain, we can use it to construct queries even for new topics for which no learning data is available. We constructed a controlled experimental setting to show that when the system has no prior information about what the user is searching for, the eye movements help significantly in the search. This is the case in proactive search, for instance, where the system monitors the user's reading behaviour on a new topic. In contrast, during a search or reading session where the set of inspected documents is biased towards being relevant, retrieving content-wise similar documents is a stronger strategy than using the eye movements.
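The core idea, learning a topic-independent mapping from per-term gaze features to a term's weight in an implicit query, can be sketched as follows. This is a minimal illustration, not the paper's actual model: the feature set (fixation count, total fixation duration, regressions), the synthetic labels, and the use of ridge regression are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-term gaze features: [fixation count, total fixation
# duration, number of regressions back to the term]. Synthetic data only.
def make_data(n_terms, w_true):
    X = rng.random((n_terms, 3))
    # Binary "term is relevant to the user's interest" labels, known in
    # the training phase.
    y = (X @ w_true + 0.05 * rng.standard_normal(n_terms) > 0.8).astype(float)
    return X, y

w_true = np.array([0.6, 0.9, 0.4])  # assumed ground-truth feature weights

# Training phase: the user's interest (term relevance) is known.
X_train, y_train = make_data(200, w_true)

# Learn the gaze-to-query-weight mapping with ridge regression:
# w = (X^T X + lam * I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(3), X_train.T @ y_train)

# New topic, no labels: assuming the mapping is universal, score the
# terms the user just read and keep the top-weighted ones as the query.
terms = ["gaze", "retrieval", "banana", "query", "reading"]
X_new = rng.random((len(terms), 3))
scores = X_new @ w
query = [t for t, _ in sorted(zip(terms, scores), key=lambda p: -p[1])[:3]]
print(query)
```

The resulting term list would then be handed to a standard retrieval engine as the implicit query; the universality assumption is what lets the learned weights `w` transfer to topics with no training data.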