HaptoBend: shape-changing passive haptic feedback in virtual reality

We present HaptoBend, a novel shape-changing input device that provides passive haptic feedback (PHF) for a wide spectrum of objects in virtual reality (VR). Past VR research shows that PHF increases presence and improves user task performance. However, providing PHF for multiple objects usually requires complex, immobile systems or multiple props. HaptoBend addresses this problem by allowing users to bend the device into 2D plane-like shapes and multi-surface 3D shapes. Drawing on research demonstrating the dominance of human vision over other senses in VR, we argue that HaptoBend's physical approximations of virtual objects can provide realistic haptic feedback. To test how effectively HaptoBend matches 2D planar and 3D multi-surface shapes, we conducted an experiment with 20 participants modeled after gesture elicitation studies. High goodness and ease scores show that shape-changing passive haptic devices like HaptoBend are an effective approach to generalized haptics. Further analysis supports the use of physical approximations for realistic haptic feedback.
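
As an illustration of how elicitation-style ratings of this kind might be aggregated, the minimal sketch below computes per-shape means and standard deviations of 7-point goodness and ease ratings. The shape names, rating values, and participant counts are hypothetical placeholders for illustration only, not data from the study.

```python
# Hedged sketch (illustrative, not from the paper): aggregating per-shape
# "goodness" and "ease" ratings from an elicitation-style study.
from statistics import mean, stdev

# Hypothetical 7-point Likert ratings (1 = poor fit, 7 = excellent fit),
# keyed by target virtual shape; one rating per participant.
ratings = {
    "2D plane":         {"goodness": [6, 7, 6, 5, 7, 6], "ease": [7, 6, 6, 6, 7, 5]},
    "3D multi-surface": {"goodness": [5, 6, 6, 4, 6, 5], "ease": [5, 5, 6, 4, 6, 5]},
}

for shape, scales in ratings.items():
    for scale, scores in scales.items():
        m, s = mean(scores), stdev(scores)
        print(f"{shape:>16} | {scale:<8} | mean={m:.2f}  sd={s:.2f}  n={len(scores)}")
```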
