Recent developments in eye tracking technology are paving the way for gaze-driven interaction as a primary interaction modality. Despite successful efforts, existing solutions to the "Midas Touch" problem suffer from two inherent, yet unaddressed issues: 1) lower accuracy, and 2) visual fatigue. In this work we present GAWSCHI: a Gaze-Augmented, Wearable-Supplemented Computer-Human Interaction framework that enables accurate and quick gaze-driven interactions while remaining fully immersive and hands-free. GAWSCHI combines an eye tracker with a wearable device (a quasi-mouse) operated by the user's foot, specifically the big toe. We evaluated the system in a comparative user study with 30 participants, each performing eleven predefined interaction tasks (on MS Windows 10) using both mouse-based and gaze-driven interaction. We found that gaze-driven interaction with GAWSCHI matches mouse-based interaction in both time and precision, provided the dimensions of the interface element exceed a threshold (0.60" x 0.51"). In addition, an analysis of the NASA Task Load Index post-study survey showed that participants experienced low mental, physical, and temporal demand, and reported high performance. We foresee GAWSCHI serving as a primary interaction modality for physically challenged users and as an enriched interaction modality for able-bodied users.