Information Retrieval by Inferring Implicit Queries from Eye Movements

We introduce a new search strategy in which the information retrieval (IR) query is inferred from eye movements measured while the user reads text during an IR task. In the training phase, the users' interests, that is, the relevance of the training documents, are known. We learn a predictor that produces a "query" given the eye movements; the target of learning is an "optimal" query computed from the known relevance of the training documents. Assuming the predictor is universal with respect to users' interests, it can also be applied to infer the implicit query when no prior knowledge of the users' interests is available. An empirical study shows that the implicit query can be learned from a small set of read documents, such that relevance predictions for a large set of unseen documents are ranked significantly better than by random guessing.
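To make the pipeline concrete, the following is a minimal sketch in Python of one way the approach could be instantiated; the abstract does not specify the predictor or the query representation, so the ridge regressor, the term-vector query space, and the placeholder random data below are all assumptions for illustration only.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical data, one row per training document read by the user:
    # E: eye-movement features aggregated per document (n_train x d_eye)
    # Q: "optimal" query targets derived from the known relevance labels,
    #    e.g. term vectors emphasizing relevant documents (n_train x d_terms)
    rng = np.random.default_rng(0)
    E = rng.normal(size=(40, 20))           # placeholder eye-movement features
    Q = rng.normal(size=(40, 100))          # placeholder "optimal" query targets

    # Training phase: learn a mapping from eye movements to the query space.
    predictor = Ridge(alpha=1.0).fit(E, Q)

    # Application phase: infer the implicit query from new eye movements ...
    E_new = rng.normal(size=(1, 20))
    q_hat = predictor.predict(E_new)        # inferred query vector (1 x d_terms)

    # ... and rank a large set of unseen documents against the inferred query.
    D_unseen = rng.normal(size=(500, 100))  # placeholder document term vectors
    scores = cosine_similarity(q_hat, D_unseen).ravel()
    ranking = np.argsort(-scores)           # highest-scoring documents first

With real data, E would come from fixation and reading measures over the viewed documents, Q from the relevance judgments available at training time, and the ranking over D_unseen would be compared against a random baseline as in the reported study.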
