MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

MetaSpace II (MS2) is a social virtual reality (VR) system in which multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around a space, and receive tactile feedback. MS2 supports walking in physical space by tracking each user's skeleton in real time, and it lets users feel the environment through passive haptics: when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements, MS2 establishes a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users can walk freely while wearing a head-mounted display, avoid obstacles such as walls and furniture, and interact with people and objects.

Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are shown only a representation of their hands floating in front of the camera, as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the person's natural movements in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
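The correspondence between the scanned room and the virtual world implies a calibrated mapping from the tracker's coordinate frame into the virtual frame, so that tracked skeleton joints drive the avatar and physical objects line up with their virtual counterparts. The abstract does not give the actual math; the sketch below is a minimal illustration under the assumption of a rigid calibration (a yaw rotation about the vertical axis plus a translation, y-up). The function names `calibration_transform` and `to_virtual` are hypothetical, not part of MS2.

```python
import numpy as np

def calibration_transform(yaw_deg, offset):
    """Hypothetical 4x4 rigid transform aligning the tracking frame
    with the virtual world built from the room scan (y is up)."""
    t = np.radians(yaw_deg)
    c, s = np.cos(t), np.sin(t)
    T = np.eye(4)
    # Rotation about the vertical (y) axis.
    T[0, 0], T[0, 2] = c, s
    T[2, 0], T[2, 2] = -s, c
    # Translation from tracker origin to virtual-world origin.
    T[:3, 3] = offset
    return T

def to_virtual(joints_real, T):
    """Map an (N, 3) array of tracked joint positions into the
    virtual frame with a single homogeneous matrix multiply."""
    homog = np.hstack([joints_real, np.ones((len(joints_real), 1))])
    return (homog @ T.T)[:, :3]
```

With such a mapping, every tracked joint position updates the avatar each frame, and because physical furniture was scanned into the virtual scene, reaching for a virtual object also lands the hand on its physical counterpart.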
