Wrist-worn pervasive gaze interaction

This paper addresses gaze interaction for smart home control from a wrist-worn unit. First, we asked ten people to enact the gaze movements they would propose for tasks such as opening a door or adjusting the room temperature. On the basis of their suggestions, we built and tested several versions of a prototype that applies off-screen stroke input. Command prompts were given to twenty participants as text or arrow displays. By the end of their first encounter with the system, participants achieved an average success rate of 46%; it took them 1.28 seconds to connect with the system and 1.29 seconds to make a correct selection. Their subjective evaluations were positive with regard to the speed of the interaction. We conclude that gaze gesture input seems feasible for fast, brief remote control of smart home technology, provided that the robustness of tracking is improved.
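The off-screen stroke input described above can be illustrated with a minimal sketch: a stroke is registered when the gaze trace exits the display bounds, and the exit side selects a command. All names, thresholds, and the command mapping here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of off-screen gaze stroke classification.
# Assumes a stream of (x, y) gaze samples normalized so the watch
# display spans [0, 1] x [0, 1]; samples outside that square are
# off-screen. All identifiers are illustrative, not from the paper.

def classify_stroke(samples):
    """Return the direction of the first off-screen stroke, or None.

    The side on which gaze leaves the display determines the stroke
    direction (up/down/left/right).
    """
    for x, y in samples:
        if x < 0.0:
            return "left"
        if x > 1.0:
            return "right"
        if y < 0.0:
            return "up"
        if y > 1.0:
            return "down"
    return None  # gaze never left the display

# Illustrative command mapping (assumed, not taken from the study).
COMMANDS = {
    "up": "raise temperature",
    "down": "lower temperature",
    "left": "close door",
    "right": "open door",
}

# A gaze trace sweeping rightward until it exits the display.
gaze_trace = [(0.5, 0.5), (0.7, 0.5), (0.9, 0.5), (1.1, 0.5)]
print(COMMANDS[classify_stroke(gaze_trace)])  # → open door
```

In practice such a classifier would sit behind fixation filtering and a connection gesture (the paper reports 1.28 s to connect), but the core direction test reduces to a bounds check like the one above.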
