Displaying haptic interaction with a synthesized model of real objects using SPIDAR-mouse

We have developed a haptic interaction system that enables users to interact with a synthesized model of real objects using the SPIDAR-mouse. Although the SPIDAR-mouse is a convenient device for displaying haptic feedback in a PC environment, the force it can present is limited to 2D lateral force. A gradient-based method is typically used to generate 2D lateral force in haptic rendering, but its performance in representing geometric shapes is insufficient. We therefore introduced an impulse-based method into the haptic rendering to represent bump information with 2D lateral force. In addition, a polygon reduction function was implemented to reduce the number of polygons in the synthesized model, improving response performance. A preliminary experiment was conducted to evaluate the system. The results indicated that users were able to distinguish 3D shapes with distinctive geometric features, but had difficulty distinguishing fine differences, such as similar types of bumps.
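To illustrate the distinction the abstract draws, the following is a minimal sketch (not the authors' implementation) of the two rendering strategies over a height map: the gradient-based method outputs a lateral force proportional to the local slope, while the impulse-based method outputs a brief opposing force when the cursor crosses a height step (a bump edge). The function names, the stiffness parameter `k`, and the height-map representation are assumptions for illustration only.

```python
import numpy as np

def gradient_force(height_map, x, y, k=1.0):
    """Gradient-based lateral force: F = -k * grad(h).

    The cursor is pushed away from rising slopes, so smooth geometry
    is felt, but sharp bumps are under-represented.
    (Illustrative sketch; k and the height map are assumed.)"""
    gy, gx = np.gradient(height_map)  # gradients along rows (y) and columns (x)
    return np.array([-k * gx[y, x], -k * gy[y, x]])

def impulse_force(height_map, prev_pos, pos, k=1.0):
    """Impulse-based lateral force: a brief force opposing motion,
    scaled by the height discontinuity just crossed.

    This makes bump edges feel crisp even when the gradient is
    undefined or too localized for the gradient-based method."""
    (px, py), (x, y) = prev_pos, pos
    dh = height_map[y, x] - height_map[py, px]
    if abs(dh) < 1e-9:
        return np.zeros(2)  # no height step crossed: no impulse
    direction = np.array([x - px, y - py], dtype=float)
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        return np.zeros(2)  # cursor did not move
    return -k * dh * direction / norm
```

For example, on a uniform ramp the gradient-based force is a constant push down the slope, whereas the impulse-based force fires only at the instant the cursor steps across an edge of a bump.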
