Controlling a Smartphone Using Gaze Gestures as the Input Mechanism

The emergence of small handheld devices such as tablets and smartphones, often with touch-sensitive surfaces as their only input modality, has spurred a growing interest in gestures for human–computer interaction (HCI). Previous work has shown that humans can consciously control their eye movements to the extent of performing sequences of predefined movement patterns, or "gaze gestures", which can be used for HCI on desktop computers. Gaze gestures can be tracked noninvasively with a video-based eye-tracking system. We propose here that gaze gestures can also be an effective input paradigm for interacting with handheld electronic devices. Through a pilot user study, we show how gaze gestures can be used to interact with a smartphone, how easily potential users assimilate them, and how the Needleman-Wunsch algorithm can effectively discriminate intentional gaze gestures from the typical gaze activity performed during standard interaction with a small smartphone screen. Reliable gaze–smartphone interaction is thus possible, with accuracy rates above 80 to 90% depending on the gaze-gesture modality used (with or without dwell), negligible false-positive rates, and completion times below 1 to 1.5 s per gesture. These encouraging results, together with the low-cost eye-tracking equipment used, suggest the potential of this new HCI modality for interaction with small-screen handheld devices.
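
The abstract does not spell out how the alignment step works, but Needleman-Wunsch is a standard global sequence-alignment algorithm, and the sketch below shows how it could score a gaze trace against gesture templates. All specifics are illustrative assumptions rather than the authors' implementation: the quantization of gaze samples into compass-direction strings, the match/mismatch/gap scores, the classify helper, and the decision threshold are placeholders.

```python
# Minimal sketch (not the authors' implementation) of using
# Needleman-Wunsch global alignment to tell intentional gaze gestures
# apart from ordinary gaze activity. Assumed preprocessing: gaze samples
# have already been quantized into strings of compass directions
# ("N", "NE", "E", ...). All scoring constants are placeholders.

MATCH, MISMATCH, GAP = 2, -1, -1  # illustrative scoring scheme


def needleman_wunsch(seq_a, seq_b):
    """Return the global alignment score of two direction sequences."""
    rows, cols = len(seq_a) + 1, len(seq_b) + 1
    score = [[0] * cols for _ in range(rows)]
    # First row/column accumulate gap penalties.
    for i in range(rows):
        score[i][0] = i * GAP
    for j in range(cols):
        score[0][j] = j * GAP
    # Each cell keeps the best of a diagonal (match/mismatch) step
    # or a gap in either sequence.
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (
                MATCH if seq_a[i - 1] == seq_b[j - 1] else MISMATCH
            )
            score[i][j] = max(diag,
                              score[i - 1][j] + GAP,
                              score[i][j - 1] + GAP)
    return score[-1][-1]


def classify(observed, templates, threshold=5):
    """Return the name of the best-matching gesture template, or None
    if even the best alignment score stays below the (placeholder)
    threshold, i.e. the trace looks like ordinary gaze activity."""
    best_name = max(templates,
                    key=lambda n: needleman_wunsch(observed, templates[n]))
    best_score = needleman_wunsch(observed, templates[best_name])
    return best_name if best_score >= threshold else None


if __name__ == "__main__":
    gestures = {
        "L-shape": ["S", "S", "E", "E"],
        "zigzag":  ["E", "W", "E", "W"],
    }
    # A noisy L-shaped trace still aligns well with its template ...
    print(classify(["S", "S", "SE", "E"], gestures))  # -> "L-shape"
    # ... while an unrelated scan aligns poorly with every template.
    print(classify(["N", "E", "S", "W"], gestures))   # -> None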
