Learning relevance from natural eye movements in pervasive interfaces

We study the feasibility of the following idea: could a system learn to infer the relevance of real-world objects from the user's natural eye movements, given training data that the user produces by clicking a "relevance" button during a learning session? If so, the combination of eye tracking and machine learning would provide a basis for "natural" interaction with the system simply by looking around, which would be very useful in mobile proactive setups. We measured users' eye movements while they explored an artificial art gallery and labeled the relevant paintings by clicking a button while looking at them. The results show that a Gaussian process classifier with a time-series kernel on the eye movements within an object predicts whether that object is relevant more accurately than dwell-time thresholding and random guessing.
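
The classification setup described above can be sketched in a few lines of Python. The sketch below is illustrative only and is not the authors' implementation: the gaze data are synthetic, each within-object gaze sequence is resampled to a fixed length so that a standard RBF kernel can stand in for the alignment-based time-series kernel, and the dwell-time baseline is a simple threshold on total fixation time chosen on the training set.

```python
# Minimal sketch (not the authors' implementation): a Gaussian process
# classifier over per-object gaze sequences, compared against a
# dwell-time threshold baseline and chance. All data below are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_gaze_sequence(relevant: bool) -> np.ndarray:
    """One viewing of one object: rows of (x, y, fixation_duration).

    Purely synthetic assumption: relevant objects attract somewhat more
    and longer fixations, with heavy overlap so the task is not trivial.
    """
    n_fix = rng.integers(3, 9) + (2 if relevant else 0)
    xy = rng.normal(0.5, 0.15 if relevant else 0.25, size=(n_fix, 2))
    dur = rng.gamma(2.0, 0.12 if relevant else 0.10, size=(n_fix, 1))
    return np.hstack([xy, dur])

def resample(seq: np.ndarray, n_points: int = 10) -> np.ndarray:
    """Linearly resample a variable-length sequence to a fixed length and
    flatten it, so that an ordinary RBF kernel can compare sequences."""
    t_old = np.linspace(0.0, 1.0, len(seq))
    t_new = np.linspace(0.0, 1.0, n_points)
    cols = [np.interp(t_new, t_old, seq[:, j]) for j in range(seq.shape[1])]
    return np.concatenate(cols)

# Build a labelled data set: one gaze sequence per viewed object.
labels = rng.integers(0, 2, size=200)
sequences = [synthetic_gaze_sequence(bool(y)) for y in labels]
X = np.array([resample(s) for s in sequences])
dwell = np.array([s[:, 2].sum() for s in sequences])  # total fixation time

X_tr, X_te, y_tr, y_te, d_tr, d_te = train_test_split(
    X, labels, dwell, test_size=0.5, random_state=0)

# Gaussian process classifier on the (resampled) gaze sequences.
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
gpc.fit(X_tr, y_tr)
gp_acc = gpc.score(X_te, y_te)

# Dwell-time baseline: threshold total fixation time at the value that
# best separates the training data.
best_t = max(np.unique(d_tr), key=lambda t: ((d_tr >= t) == y_tr).mean())
dwell_acc = ((d_te >= best_t) == y_te).mean()

print(f"GP classifier accuracy:    {gp_acc:.2f}")
print(f"Dwell-time threshold acc.: {dwell_acc:.2f}")
print("Random guessing:           0.50")
```

A closer reproduction would replace the resample-plus-RBF step with a kernel computed directly between variable-length fixation sequences, for example a dynamic time-alignment kernel, and would of course use recorded rather than synthetic gaze data.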
