PhyShare: Sharing Physical Interaction in Virtual Reality

We present PhyShare, a new haptic user interface based on actuated robots. Virtual reality has recently been gaining wide adoption, and effective haptic feedback in VR scenarios can strongly support users' senses in bridging the virtual and physical worlds. Since participants do not directly observe these robotic proxies, we investigate multiple mappings between physical robots and virtual proxies that make efficient use of the robotic resources available to provide a well-rounded VR experience. PhyShare bots can act either as directly touchable objects or as invisible carriers of physical objects, depending on the scenario. They also support distributed collaboration, allowing remotely located VR collaborators to share the same physical feedback.
