GazeButton: enhancing buttons with eye gaze interactions

A button is a user interface element for triggering an action, traditionally by click or touch. We introduce GazeButton, a novel concept that extends the default button mode with advanced gaze-based interactions. During normal interaction, users can utilise this button as a universal hub for gaze-based UI shortcuts. The advantages are: 1) it is easy to integrate into existing UIs, 2) it is complementary, as users can choose either gaze or manual interaction, 3) it is straightforward, as all features are located in one button, and 4) a single button suffices to interact with the whole screen. We explore GazeButtons in a custom-made text reading, writing, and editing tool on a multitouch tablet device. For example, the text cursor can be placed by looking at the target position and tapping the GazeButton, avoiding costly physical movement. Alternatively, users can select a passage simply by gazing over it while holding the GazeButton. We present a design space, specific application examples, and point to future button designs that become highly expressive by unifying the user's visual and manual input.
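To make the interaction concrete, below is a minimal sketch of how a single button could route touch events together with the current gaze position to cursor placement and gaze-based selection. All names (GazeButton, TextEditor, GazePoint, on_tap, on_gaze_update) and the event wiring are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Hypothetical sketch: one GazeButton dispatching tap/hold events that are
# interpreted relative to where the user is currently looking.

from dataclasses import dataclass


@dataclass
class GazePoint:
    x: float
    y: float


class TextEditor:
    """Minimal stand-in for the text reading/writing/editing tool."""

    def set_cursor_at(self, point: GazePoint) -> None:
        print(f"cursor moved to ({point.x:.0f}, {point.y:.0f})")

    def extend_selection_to(self, point: GazePoint) -> None:
        print(f"selection extended to ({point.x:.0f}, {point.y:.0f})")


class GazeButton:
    """Routes taps and holds on a single button to gaze-targeted actions."""

    def __init__(self, editor: TextEditor):
        self.editor = editor
        self.held = False

    def on_tap(self, gaze: GazePoint) -> None:
        # Tap while looking at a text position: place the cursor there.
        self.editor.set_cursor_at(gaze)

    def on_hold_start(self) -> None:
        self.held = True

    def on_gaze_update(self, gaze: GazePoint) -> None:
        # While the button is held, text the gaze sweeps over is selected.
        if self.held:
            self.editor.extend_selection_to(gaze)

    def on_hold_end(self) -> None:
        self.held = False


# Usage with simulated events:
editor = TextEditor()
button = GazeButton(editor)
button.on_tap(GazePoint(120, 340))          # look + tap sets the cursor
button.on_hold_start()
button.on_gaze_update(GazePoint(200, 340))  # gaze sweeps over text while holding
button.on_gaze_update(GazePoint(280, 360))
button.on_hold_end()
```

The point of the design, as the abstract describes, is that the button itself stays a conventional touch target; the gaze signal only modulates what the tap or hold means.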
