Haptic rendering with the Toolhandle haptic interface

This thesis describes the design of the hardware and software for a haptic interface system. A haptic interface allows a human "observer" to explore and interact with a virtual environment using the sense of touch. Haptic interface systems include three main components: the haptic interface (usually an electro-mechanical system capable of exerting forces on a user), a model of the environment to be touched, and a rendering algorithm which unites the first two by generating the feedback forces based on the environment model. This thesis focuses on the first and third of these components: a haptic interface, the MIT-Toolhandle, and haptic rendering algorithms for simulating general real-world virtual environments.

The MIT-Toolhandle is a ground-based force-feedback device designed to allow subjects to use tools to interact with virtual environments. One of the difficulties of haptic interfaces is simulating both the human's kinesthetic and tactile systems. Tool interactions are interesting because they passively satisfy the human's tactile sense: since the user holds a physical tool, the simulation can be reduced to simulating the interactions between the tool and the environment. The description of the MIT-Toolhandle is accompanied by a history of previous haptic interfaces and some insight into the psychophysical issues involved in the synthesis of haptic feedback.

The rendering algorithms described fall into two major categories: those for coarse geometry and those for surface effects. We examine vector field and god-object methods for representing geometry. The weaknesses of the vector field method are uncovered, and we explain how the god-object method handles these difficulties. Lastly, extensions to the god-object method for friction, texture, and surface smoothing are described.

Thesis Supervisor: J. Kenneth Salisbury
Title: Principal Research Scientist
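To make the distinction drawn in the abstract concrete, the sketch below contrasts a penalty-style "vector field" force with a god-object force that tracks a proxy point constrained to the touched surface. This is a minimal illustration, not the thesis implementation: the single-plane geometry, the 800 N/m stiffness, and the function names are assumptions chosen for brevity.

```python
# Illustrative sketch only: one planar surface, quasi-static spring forces.
import numpy as np

STIFFNESS = 800.0  # N/m, hypothetical servo-loop stiffness


def vector_field_force(tip, plane_point, plane_normal):
    """Penalty ("vector field") force: push the interface point out along the
    local normal in proportion to penetration depth. Because the force depends
    only on the current tip position, thin objects and corners are ambiguous."""
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = np.dot(plane_point - tip, n)  # > 0 when the tip is below the surface
    return STIFFNESS * depth * n if depth > 0 else np.zeros(3)


def god_object_force(tip, plane_point, plane_normal):
    """God-object force: keep a proxy point on the constraint surface and pull
    the tip toward it with a virtual spring. Tracking the proxy over time is
    what lets the method resolve which face of a thin object is being touched."""
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = np.dot(plane_point - tip, n)
    proxy = tip + depth * n if depth > 0 else tip  # project onto the surface when penetrating
    force = STIFFNESS * (proxy - tip)
    return force, proxy
```

In both cases the returned force would be commanded to the device each servo cycle; the god-object version additionally carries the proxy forward as state between cycles.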
