Toward Mobile Eye-Based Human-Computer Interaction

Current research on eye-based interfaces mostly focuses on stationary settings. However, advances in mobile eye-tracking equipment and automated eye-movement analysis now make it possible to investigate eye movements during natural behavior and promise to bring eye-based interaction into people's everyday lives. These recent developments in mobile eye-tracking hardware point the way toward unobtrusive human-computer interfaces that could become pervasively usable in everyday life. The potential applications of tracking and analyzing eye movements anywhere and at any time call for new research to develop and understand eye-based interaction in mobile, daily-life settings.
