The King-Kong Effects: Improving sensation of walking in VR with visual and tactile vibrations at each step

In this paper we present novel sensory feedback, named "King-Kong Effects", to enhance the sensation of walking in virtual environments. King-Kong Effects are inspired by special effects in movies, in which the approach of a gigantic creature is suggested by adding visual vibrations/pulses to the camera at each of its steps. We propose to add artificial visual or tactile vibrations (King-Kong Effects, or KKE) at each footstep detected (or simulated) during the user's virtual walk. The user can remain seated: our system uses vibrotactile tiles located under the feet for tactile rendering, in addition to the visual display. We have designed different kinds of KKE based on vertical or lateral oscillations, physical or metaphorical patterns, and one or two peaks simulating heel-toe ground contacts. We conducted several experiments to evaluate users' preferences when navigating with or without the various KKE. Taken together, our results identify the best choices for future uses of visual and tactile KKE, and they suggest a preference for multisensory combinations. Our King-Kong Effects could be used in a variety of VR applications targeting the immersion of a user walking in a 3D virtual scene.
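The abstract gives no implementation details beyond this description, but the core idea of a per-step visual pulse, with an optional second, weaker peak for heel-toe contact, can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' method: the function name kke_offset, its parameter values, and the damped-sinusoid pulse shape are all assumptions chosen for clarity.

```python
import math

def kke_offset(t_since_step, amplitude=0.02, freq_hz=12.0, decay=8.0,
               heel_toe_delay=0.12, two_peaks=True):
    """Camera offset (meters) t_since_step seconds after a detected footstep.

    A decaying sinusoid approximates the visual "pulse" of a King-Kong
    Effect; an optional second, weaker pulse delayed by heel_toe_delay
    mimics the heel-then-toe ground contact the paper describes.
    All numeric values here are illustrative, not taken from the paper.
    """
    def pulse(t, a):
        # Zero before the contact instant; damped oscillation afterwards.
        if t < 0:
            return 0.0
        return a * math.exp(-decay * t) * math.sin(2 * math.pi * freq_hz * t)

    offset = pulse(t_since_step, amplitude)
    if two_peaks:
        # Second, attenuated peak for the toe contact.
        offset += pulse(t_since_step - heel_toe_delay, 0.6 * amplitude)
    return offset

# Sample the pulse over the first 0.3 s after a step.
for i in range(7):
    t = i * 0.05
    print(f"t={t:.2f}s  offset={kke_offset(t) * 1000:+.2f} mm")
```

In a render loop, the returned offset would be added to the camera's vertical position each frame (or to a lateral axis for the lateral variant), and the same envelope could plausibly modulate the drive intensity of the vibrotactile tiles under the user's feet.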
