We have developed an interactive system that allows untethered users to experience walking on virtual ground surfaces resembling natural materials. The demonstration consists of a multimodal floor interface that provides auditory, tactile, and visual feedback in response to users' steps. It is intended for immersive virtual and augmented reality environments (VEs) that convey the impression of walking over natural ground surfaces, such as snow and ice. To date, immersive environments with interactive floor surfaces have largely focused on visual and auditory feedback linked to a VE simulation (e.g., [Grønbæk 2007]; see also the comparative review in [Miranda and Wanderley 2006]). However, while walking in natural environments, we receive continuous, multisensory information about the nature of the ground we walk on -- the crush of dry leaves, the soft compression of grass. The static nature of floor surfaces in existing VEs typically bears little resemblance to a given natural ground material, creating a perceptual conflict with the dynamic visual and/or auditory feedback that users receive in the VE. This project illustrates a novel approach to reconciling such perceptual conflicts, based on multisensory feedback provided through a floor surface in response to users' steps.
[1] Federico Fontana et al. Physics-based sound synthesis and control: crushing, walking and running by crumpling sounds. 2003.
[2] Kaj Grønbæk et al. iGameFloor: a platform for co-located collaborative games. ACE '07, 2007.
[3] Eduardo R. Miranda and Marcelo M. Wanderley. New Digital Musical Instruments: Control and Interaction Beyond the Keyboard (Computer Music and Digital Audio Series). 2006.
[4] A. W. Law et al. A multi-modal floor-space for experiencing material deformation underfoot in virtual reality. 2008 IEEE International Workshop on Haptic Audio Visual Environments and Games, 2008.