Integrated multimodal interaction using texture representations

In this paper, we explore texture mapping as a unified representation for enabling realistic multimodal interaction with finely detailed surfaces. We show how both normal maps and relief maps can serve as unified representations for handling collisions with textured rigid-body objects, synthesizing complex sound effects from long-lasting collisions, and rendering haptic textures. The resulting multimodal display system allows a user to see, hear, and feel complex interactions with textured surfaces. By using texture representations for seamlessly integrated multimodal interaction, instead of the complex triangular meshes otherwise required, this work achieves up to a 25-times performance speedup and reduces memory storage by up to six orders of magnitude. We further validate the results through user studies demonstrating the effectiveness of texture representations for integrated multimodal display.

Highlights

- We present a system displaying fine details of visuals, haptics, and audio.
- We present a new method for rigid-body physics on normal-mapped surfaces.
- Multimodal interaction with texture maps reduces sensory conflict.
- Users prefer multimodal relief-map interaction over normal maps.
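To make the core idea concrete, the sketch below shows one common way a relief (height) map can replace mesh geometry for haptic rendering: bilinearly sample the height field under a point probe, and, if the probe penetrates the surface, return a penalty force along a normal estimated from the height-field gradient. This is a minimal illustration of the general texture-based approach, not the paper's actual algorithm; all function names, parameters, and the stiffness constant are illustrative assumptions. Heights and the probe position are expressed in the surface's tangent frame, with (u, v) texture coordinates in [0, 1].

```python
import numpy as np

def bilinear_sample(height_map, u, v):
    """Bilinearly sample a 2D height map at texture coordinates (u, v) in [0, 1]."""
    h, w = height_map.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * height_map[y0, x0] + fx * height_map[y0, x1]
    bot = (1 - fx) * height_map[y1, x0] + fx * height_map[y1, x1]
    return (1 - fy) * top + fy * bot

def haptic_force(height_map, probe_uvh, stiffness=800.0):
    """Penalty-based contact force for a point probe against a relief-mapped surface.

    probe_uvh = (u, v, height) in the surface's tangent frame.
    Returns a 3-vector force (tangent-u, tangent-v, normal components).
    """
    u, v, probe_h = probe_uvh
    surf_h = bilinear_sample(height_map, u, v)
    penetration = surf_h - probe_h
    if penetration <= 0.0:
        return np.zeros(3)  # probe is above the surface: no contact force
    # Estimate the surface normal from central differences of the height field.
    eps = 1e-3
    dh_du = (bilinear_sample(height_map, min(u + eps, 1.0), v)
             - bilinear_sample(height_map, max(u - eps, 0.0), v)) / (2 * eps)
    dh_dv = (bilinear_sample(height_map, u, min(v + eps, 1.0))
             - bilinear_sample(height_map, u, max(v - eps, 0.0))) / (2 * eps)
    n = np.array([-dh_du, -dh_dv, 1.0])
    n /= np.linalg.norm(n)
    return stiffness * penetration * n
```

The same sampled penetration depth and contact normal could, in principle, also drive the rigid-body collision response and the impact/scraping events used for sound synthesis, which is what makes a single texture representation attractive for all three modalities.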
