Gaze-Based Human-SmartHome-Interaction by Augmented Reality Controls

The use of eye tracking systems enables people with motor disabilities to interact with computers and thus with their environment. Combined with an optical see-through head-mounted display (OST-HMD), eye tracking allows interaction with virtual objects that are attached to real objects or to actions that can be performed in the SmartHome environment. A user can thus trigger actions of real SmartHome actuators by gazing at the virtual objects shown in the OST-HMD. In this paper we propose a mobile system that combines a low-cost commercial eye tracker with a commercial OST-HMD, intended for a SmartHome application. As a proof of concept, we control an LED strip using gaze-based augmented reality controls. We present a calibration procedure for the OST-HMD and evaluate the influence of the OST-HMD on the accuracy of the eye tracking.
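The core interaction described above — a gaze resting on a virtual control long enough to trigger a real actuator — can be sketched as a simple dwell-time hit test. The following is a minimal illustration only; the class, its parameters (e.g. `dwell_s`), and the LED callback are hypothetical and not taken from the paper's actual implementation.

```python
class GazeControl:
    """Hypothetical virtual AR control that fires a SmartHome action after a gaze dwell."""

    def __init__(self, x, y, w, h, action, dwell_s=0.8):
        self.bounds = (x, y, x + w, y + h)  # screen-space rectangle of the control
        self.action = action                # callback, e.g. switching an LED strip
        self.dwell_s = dwell_s              # gaze must rest this long to trigger
        self._gaze_start = None             # timestamp when the gaze entered the control

    def contains(self, gx, gy):
        """Check whether a gaze point lies inside the control's rectangle."""
        x0, y0, x1, y1 = self.bounds
        return x0 <= gx <= x1 and y0 <= gy <= y1

    def update(self, gx, gy, now):
        """Feed one gaze sample; fire the action once the dwell time elapses."""
        if self.contains(gx, gy):
            if self._gaze_start is None:
                self._gaze_start = now      # gaze just entered the control
            elif now - self._gaze_start >= self.dwell_s:
                self._gaze_start = None     # reset so the action fires only once
                self.action()
                return True
        else:
            self._gaze_start = None         # gaze left the control: reset dwell
        return False


# Usage: a control that "switches on" an LED strip after a sustained gaze.
led_on = []
ctrl = GazeControl(100, 100, 60, 60, action=lambda: led_on.append(True))
triggered = False
for t in (0.0, 0.5, 1.0):  # simulated gaze samples inside the control
    triggered = ctrl.update(120, 130, t) or triggered
```

In practice the gaze point would come from the eye tracker mapped into the OST-HMD's display coordinates, which is exactly why the calibration and accuracy evaluation mentioned above matter: the hit test is only as reliable as that mapping.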