The Haptic Museum

Our IMSC team has used haptics to allow museum visitors to explore three-dimensional works of art by “touching” them, something that is not possible in ordinary museums because of prevailing “hands-off” policies [1, 2]. Haptics involves the modality of touch: the sensation of shape and texture an observer feels when exploring a virtual object, such as a three-dimensional model of a piece of pottery or art glass [3, 4, 5].

The haptic devices used in our research are the Phantom [6] and the CyberGrasp [7]. The Phantom is a desktop device that provides force feedback to the user’s fingertip. The image on the left, below, shows a researcher at IMSC exploring the surface of a virtual teapot from USC’s Fisher Gallery with a Phantom. The image on the right shows the researcher calibrating the CyberGrasp, a whole-hand force-feedback glove that can be used to grasp virtual objects; a network of “tendons” transmits grasp forces back to the user’s fingers and palm. Both devices can be used by a remote museum visitor who retrieves the model of the art object over the Internet or another network.

Our mission is to develop seamless, device-independent haptic collaboration so that a museum staff member and a museum-goer or art student at a remote location can jointly examine a vase or bronze figure, note its interesting contours and textures, and consider such questions as “Why did the artist make this side rough but that side smooth?” or “What is this indentation on the bottom for?”
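To give a flavor of how point-contact force feedback of the kind provided by the Phantom works, the sketch below shows a simple penalty-based rendering loop: when the device’s probe point penetrates a virtual surface, a restoring force proportional to the penetration depth is sent back to the fingertip. This is only an illustrative sketch, not our system or the device vendor’s API; the `read_position` and `send_force` callables are hypothetical stand-ins for a real device interface, and the virtual object is reduced to a sphere.

```python
import numpy as np

# Minimal penalty-based haptic rendering sketch (illustrative only).
# A real servo loop runs at roughly 1 kHz against the vendor's device API;
# read_position/send_force below are hypothetical stand-ins for those calls.

SPHERE_CENTER = np.array([0.0, 0.0, 0.0])   # virtual object: a sphere
SPHERE_RADIUS = 0.05                         # meters
STIFFNESS = 800.0                            # N/m, virtual spring constant

def contact_force(probe_pos):
    """Restoring force on the probe point when it penetrates the sphere."""
    offset = probe_pos - SPHERE_CENTER
    dist = np.linalg.norm(offset)
    penetration = SPHERE_RADIUS - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)                   # no contact: no force
    normal = offset / dist                   # outward surface normal
    return STIFFNESS * penetration * normal  # push the probe back to the surface

def haptic_loop(read_position, send_force, steps=1000):
    """One pass of the (simplified) haptic servo loop."""
    for _ in range(steps):
        pos = read_position()                # device position, in meters
        send_force(contact_force(pos))       # force sent back to the fingertip

if __name__ == "__main__":
    # Simulated device: the probe moves slowly into the sphere along +x.
    state = {"x": 0.055}

    def read_position():
        state["x"] -= 1e-5
        return np.array([state["x"], 0.0, 0.0])

    forces = []
    haptic_loop(read_position, forces.append)
    print("max force magnitude (N):", max(np.linalg.norm(f) for f in forces))
```

A whole-hand device such as the CyberGrasp applies the same idea per finger, mapping contact forces on each virtual fingertip to the corresponding tendon; for remote collaboration, the model is transferred over the network and the same rendering loop runs locally at each site.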

[1] A. Stephen Morse et al., "Control Using Logic-Based Switching," 1997.

[2] Bengt Mårtensson et al., "The order of any stabilizing regulator is sufficient a priori information for adaptive stabilization," 1985.

[3] Peter Corke et al., "Visual Control of Robot Manipulators: A Review," 1993.

[4] Joao P. Hespanha et al., "Single-camera visual servoing," Proceedings of the 39th IEEE Conference on Decision and Control, 2000.

[5] Gregory D. Hager et al., "Robot hand-eye coordination based on stereo vision," 1995.

[6] A. Morse, "Supervisory control of families of linear set-point controllers, Part I: Exact matching," IEEE Transactions on Automatic Control, 1996.

[7] John Kenneth Salisbury et al., "A constraint-based god-object method for haptic display," Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1995.

[8] Cagatay Basdogan et al., "Haptics in virtual environments: taxonomy, research status, and challenges," Computers & Graphics, 1997.

[9] Blake Hannaford et al., "Force Feedback in Virtual and Shared Environments," 1995.

[10] Peter I. Corke et al., "A tutorial on visual servo control," IEEE Transactions on Robotics and Automation, 1996.

[11] K. Gu et al., "Delay effects on stability: a survey," Proceedings of the 38th IEEE Conference on Decision and Control, 1999.

[12] Cyrus Shahabi et al., "Immersidata Management: Challenges in Management of Data Generated within an Immersive Environment," Multimedia Information Systems, 1999.

[13] Ming C. Lin et al., "Collision Detection between Geometric Models: A Survey," 1998.

[14] Joao P. Hespanha et al., "Logic-based switching algorithms in control," 1998.

[15] Thomas H. Massie et al., "The PHANToM Haptic Interface: A Device for Probing Virtual Objects," 1994.

[16] Gaurav S. Sukhatme et al., "Smoother based 3D attitude estimation for mobile robot localization," Proceedings of the 1999 IEEE International Conference on Robotics and Automation, 1999.

[17] J. Edward Colgate et al., "Issues in the haptic display of tool use," Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1995.

[18] Cagatay Basdogan et al., "The role of haptic communication in shared virtual environments," 1998.

[19] Jered Floyd et al., "Haptic interaction with three-dimensional bitmapped virtual environments," 1999.