HMD Light: Sharing In-VR Experience via Head-Mounted Projector for Asymmetric Interaction

We present HMD Light, a proof-of-concept Head-Mounted Display (HMD) implementation that reveals the Virtual Reality (VR) user's experience in the physical environment to facilitate communication between VR and external users in a mobile VR context. While previous work externalized the VR user's experience through an on-HMD display, HMD Light projects the display into the physical environment, enabling a larger display and interaction area. This work explores the interaction design space of HMD Light and presents four applications that demonstrate its versatility. In an exploratory user study, we observed pairs of participants experiencing these applications with HMD Light and evaluated usability, accessibility, and social presence between users. From the results, we distill design insights for HMD Light and for asymmetric VR collaboration.
