Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses

Gaze gestures hold potential for user input on mobile devices, especially smart glasses, as they are always available and hands-free. So far, gaze gesture recognition approaches have utilized open-eye movements only and disregarded closed-eye movements. This paper presents a first investigation of the feasibility of detecting and recognizing closed-eye gaze gestures from close-up optical sources, e.g., eye-facing cameras embedded in smart glasses. We propose four different closed-eye gaze gesture protocols, which extend the alphabet of existing open-eye gaze gesture approaches. We further propose a methodology for detecting and extracting the corresponding closed-eye movements with optical flow, time series processing, and machine learning. In the evaluation of the four protocols, we find closed-eye gaze gestures to be detected 82.8%–91.6% of the time, and extracted gestures to be recognized correctly with an accuracy of 92.9%–99.2%.
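The abstract only names the pipeline stages (optical flow, time series processing, machine learning) without detail. As a rough illustration of the motion-estimation step, the sketch below estimates per-frame eyelid displacement between consecutive eye-camera frames using phase correlation as a simple stand-in for the paper's optical-flow computation; the function name and the use of NumPy are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the global 2-D displacement (dx, dy) between two
    grayscale frames via phase correlation. This is a simplified
    stand-in for dense optical flow: it recovers a single dominant
    translation per frame pair, which can then be accumulated into
    a motion time series for gesture detection."""
    # Normalized cross-power spectrum of the two frames
    fa = np.fft.fft2(prev)
    fb = np.fft.fft2(curr)
    cross = np.conj(fa) * fb
    cross /= np.abs(cross) + 1e-12
    corr = np.real(np.fft.ifft2(cross))
    # The correlation peak location gives the circular shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    # Map shifts into the signed range so leftward/upward motion is negative
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dx), int(dy)

def motion_magnitudes(frames):
    """Turn a frame sequence into a 1-D motion-magnitude time series,
    which a threshold or classifier could scan for gesture segments."""
    shifts = [estimate_shift(a, b) for a, b in zip(frames, frames[1:])]
    return [float(np.hypot(dx, dy)) for dx, dy in shifts]
```

A downstream detector would then segment this time series (e.g., by thresholding magnitudes) and feed the extracted segments to a classifier; those later stages are not shown here.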
