Snap clutch, a moded approach to solving the Midas touch problem

This paper proposes a simple approach to an old problem, that of the 'Midas Touch'. It uses modes to enable different types of mouse behavior to be emulated with gaze, and gestures to switch between these modes. A lightweight gesture is also used to switch gaze control off when it is not needed, thereby removing a major cause of the problem. The ideas have been trialed in Second Life, which is characterized by a feature-rich set of interaction techniques and a 3D graphical world. The use of gaze with this type of virtual community is of great relevance to severely disabled people, as it can enable them to participate in the community on a similar basis to able-bodied users. The assumption here, though, is that this group will use gaze as a single modality and that dwell will be an important selection technique. The Midas Touch problem therefore needs to be considered in the context of fast dwell-based interaction. The solution proposed here, Snap Clutch, is incorporated into mouse emulator software. The user trials reported here show it to be a very promising way of dealing with some of the interaction problems that users of these complex interfaces face when using gaze by dwell.
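
To make the moded idea concrete, the sketch below models a gaze-driven mouse emulator in which a glance off one edge of the screen "snaps" the clutch into a different mode, and a dwell inside the screen triggers the current mode's action. This is a minimal illustration under stated assumptions, not the authors' implementation: the mode set, the edge-to-mode mapping, and the dwell and fixation thresholds are all hypothetical values chosen for the example.

```python
from enum import Enum, auto

class Mode(Enum):
    """Illustrative emulation modes; the actual Snap Clutch mode set may differ."""
    DWELL_CLICK = auto()  # dwell on a target issues a left click
    RIGHT_CLICK = auto()  # dwell issues a right click
    DRAG = auto()         # dwell toggles mouse button down / button up
    OFF = auto()          # gaze control disengaged (the "clutch" is out)

class SnapClutchSketch:
    """Minimal mode-switching skeleton for a gaze-driven mouse emulator."""
    DWELL_TIME = 0.5       # assumed dwell threshold, seconds
    FIXATION_RADIUS = 40   # assumed fixation tolerance, pixels
    # Hypothetical assignment of off-screen glance directions to modes.
    EDGE_MODES = {"left": Mode.DWELL_CLICK, "right": Mode.RIGHT_CLICK,
                  "top": Mode.DRAG, "bottom": Mode.OFF}

    def __init__(self, screen_w, screen_h):
        self.screen = (screen_w, screen_h)
        self.mode = Mode.OFF
        self.dragging = False
        self._fix_start = None   # timestamp when the current fixation began
        self._fix_point = None   # where that fixation is anchored

    def on_gaze_sample(self, x, y, t):
        """Feed one gaze sample (pixels, seconds); return an action or None."""
        edge = self._edge(x, y)
        if edge is not None:
            # Lightweight gesture: a glance off-screen snaps to a new mode.
            self.mode = self.EDGE_MODES[edge]
            self._fix_start = None
            return f"mode -> {self.mode.name}"
        if self.mode is Mode.OFF:
            return None  # clutch disengaged: looking around causes no clicks
        if self._fix_start is not None and self._near(self._fix_point, (x, y)):
            if t - self._fix_start >= self.DWELL_TIME:
                self._fix_start = None  # one action per completed dwell
                return self._dwell_action()
        else:
            self._fix_start, self._fix_point = t, (x, y)  # new fixation
        return None

    def _dwell_action(self):
        if self.mode is Mode.DWELL_CLICK:
            return "left click"
        if self.mode is Mode.RIGHT_CLICK:
            return "right click"
        self.dragging = not self.dragging  # DRAG mode toggles the button
        return "button down" if self.dragging else "button up"

    def _edge(self, x, y):
        w, h = self.screen
        if x < 0: return "left"
        if x > w: return "right"
        if y < 0: return "top"
        if y > h: return "bottom"
        return None

    def _near(self, p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= self.FIXATION_RADIUS ** 2

if __name__ == "__main__":
    emu = SnapClutchSketch(1920, 1080)
    print(emu.on_gaze_sample(-5, 500, t=0.0))   # glance off left edge -> DWELL_CLICK
    print(emu.on_gaze_sample(400, 300, t=0.1))  # fixation begins -> None
    print(emu.on_gaze_sample(404, 298, t=0.7))  # dwell threshold met -> "left click"
```

The design point the sketch tries to capture is that an explicit OFF mode gives the user a cheap, deliberate way to disengage gaze control altogether, so that simply looking around the world generates no unintended dwell clicks.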
