Embodied Interactions with Audio-Tactile Virtual Objects in AHNE

Interactive virtual environments often focus on visual representation. This study introduces embodied, eyes-free interaction with an audio-haptic navigation environment (AHNE) in three-dimensional space. AHNE is based on an optical tracking algorithm that uses the Microsoft Kinect, and virtual objects are represented by dynamic audio-tactile cues. Users can grab and move the targets by means of a sensor embedded in a glove. To evaluate AHNE, a user experiment was conducted. Users' comments indicated that the sound cues elicited physical and visual experiences. Our findings suggest that AHNE could be a novel and enjoyable interface to everyday resources in the environment, such as a home audio system in the living room or a shopping list by the fridge.