Torso versus gaze direction to navigate a VE by walking in place

In this work, we present a simple method of walking in place (WIP) using the Microsoft Kinect to explore a virtual environment (VE) with a head-mounted display (HMD). Prior studies have shown that exploring a VE with WIP is equivalent to normal walking in terms of spatial orientation, which suggests that WIP is a promising way to explore a large VE. The Microsoft Kinect sensor is well suited to implementing WIP because it provides real-time skeletal tracking and is relatively inexpensive (150 USD). However, the skeletal data obtained from Kinect sensors can be noisy, so we describe how we combine the data from two Kinects to implement a robust WIP algorithm. As part of our analysis of how best to implement WIP with the Kinect, we compare gaze-directed locomotion to torso-directed locomotion. We report that participants' spatial orientation was better when they translated forward in the VE in the direction they were looking.
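To make the approach concrete, the sketch below illustrates the kind of logic the abstract describes: fusing the skeletal data from two Kinect sensors, detecting a walking-in-place step from knee lift, and translating the viewpoint along either the gaze (HMD) direction or the torso direction. The fusion rule (averaging joint positions), the step threshold, the forward speed, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

STEP_THRESHOLD = 0.05  # metres of knee lift that counts as a step (assumed value)
STEP_SPEED = 1.2       # forward speed in m/s applied while stepping (assumed value)


def fuse_joint(joint_a, joint_b):
    """Average the same joint as reported by the two Kinects.

    Both positions are assumed to already be expressed in a common world
    coordinate frame (the extrinsic calibration step is not shown here).
    """
    if joint_a is None:
        return None if joint_b is None else np.asarray(joint_b, dtype=float)
    if joint_b is None:
        return np.asarray(joint_a, dtype=float)
    return (np.asarray(joint_a, dtype=float) + np.asarray(joint_b, dtype=float)) / 2.0


def detect_step(left_knee, right_knee, baseline_height):
    """Return True if either fused knee position is lifted above its rest height."""
    lift_left = left_knee[1] - baseline_height
    lift_right = right_knee[1] - baseline_height
    return max(lift_left, lift_right) > STEP_THRESHOLD


def locomotion_direction(hmd_forward, shoulder_left, shoulder_right, use_gaze):
    """Choose the translation direction for the current frame.

    Gaze condition: move along the HMD's forward vector.
    Torso condition: move perpendicular to the shoulder axis.
    The vertical component is dropped so the user stays on the ground plane.
    """
    if use_gaze:
        direction = np.asarray(hmd_forward, dtype=float)
    else:
        shoulder_axis = np.asarray(shoulder_right, dtype=float) - np.asarray(shoulder_left, dtype=float)
        # Rotate the shoulder axis 90 degrees about the vertical (y) axis;
        # the sign depends on the sensor's coordinate convention.
        direction = np.array([shoulder_axis[2], 0.0, -shoulder_axis[0]])
    direction[1] = 0.0
    norm = np.linalg.norm(direction)
    return direction / norm if norm > 0 else np.zeros(3)


def update_position(position, stepping, direction, dt):
    """Advance the viewpoint while the user is walking in place."""
    if stepping:
        return position + STEP_SPEED * dt * direction
    return position
```

In a per-frame loop, one would fuse the knee and shoulder joints from both sensors, call `detect_step` to decide whether the user is stepping, and then call `update_position` with the direction chosen by the experimental condition (gaze versus torso).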
