Looking my way through the menu: the impact of menu design and multimodal input on gaze-based menu selection

This paper reports a study investigating the effectiveness of two approaches to improving gaze-based interaction for realistic, complex menu selection tasks. The first approach focuses on identifying hierarchical menu designs that are particularly suitable for gaze-based interaction; the second combines gaze-based interaction with speech as a second input modality. In an experiment with 40 participants, we investigated the impact of menu design, input device, and navigation complexity on accuracy and completion time in a menu selection task, as well as on user satisfaction. The results for both objective task performance and subjective ratings confirmed our expectations: a semi-circle menu was better suited for gaze-based menu selection than either a linear or a full-circle menu. Contrary to our expectations, an input device based solely on eye gaze proved superior to the combined gaze- and speech-based device. Moreover, the drawbacks of a less suitable menu design (i.e., a linear or full-circle menu), as well as those of the multimodal input device, particularly impaired performance on the more complex navigation tasks.
