Using gaze gestures with haptic feedback on glasses

Wearable computing devices are gradually becoming common, and head-mounted devices such as Google Glass are already available. These devices present new interaction challenges: they are typically small, and the usage environment limits the available interaction modalities. One potential interaction method is to use gaze for input and haptics for output on a head-worn device. We built a demonstration system to show how gaze gestures can control a simple information application, with head area haptic feedback confirming each gesture. The demonstration and early user studies have shown that users perceive such an input-output combination as useful.
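
The abstract does not give implementation details, but the gesture-to-confirmation loop it describes can be illustrated with a minimal sketch. The zone-based gaze gesture detection, the gesture vocabulary, and the `HapticActuator` interface below are illustrative assumptions, not the demonstration system's actual design.

```python
# Minimal sketch of a gaze-gesture -> haptic-confirmation loop.
# Zone thresholds, the two-stroke gesture vocabulary and the HapticActuator
# interface are assumptions for illustration only.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class GazeSample:
    x: float  # normalized horizontal gaze position, 0.0 (left) .. 1.0 (right)
    t: float  # timestamp in seconds


class HapticActuator:
    """Placeholder for a head-worn vibrotactile actuator driver."""

    def pulse(self, duration_ms: int = 80) -> None:
        print(f"[haptics] pulse {duration_ms} ms")  # replace with a real driver call


def zone(sample: GazeSample) -> str:
    """Map a gaze sample to a coarse horizontal zone."""
    if sample.x < 0.3:
        return "left"
    if sample.x > 0.7:
        return "right"
    return "center"


def detect_gesture(samples: List[GazeSample]) -> Optional[str]:
    """Recognize a simple two-stroke gesture: center -> side -> center."""
    zones: List[str] = []
    for s in samples:
        z = zone(s)
        if not zones or zones[-1] != z:
            zones.append(z)
    if zones[-3:] == ["center", "left", "center"]:
        return "stroke_left"
    if zones[-3:] == ["center", "right", "center"]:
        return "stroke_right"
    return None


def process(samples: List[GazeSample], haptics: HapticActuator) -> Optional[str]:
    """Run detection and confirm a recognized gesture with a haptic pulse."""
    gesture = detect_gesture(samples)
    if gesture is not None:
        haptics.pulse()  # confirmation feedback on the glasses frame
    return gesture


if __name__ == "__main__":
    trace = [GazeSample(0.5, 0.00), GazeSample(0.2, 0.15), GazeSample(0.5, 0.30)]
    print(process(trace, HapticActuator()))  # -> "stroke_left"
```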