WYSIWYF Display: A Visual/Haptic Interface to Virtual Environment

To build a VR training system for visuomotor skills, the image displayed by a visual interface must be correctly registered to a haptic interface so that the visual and haptic sensations are both spatially and temporally consistent. In other words, it is desirable that what you see is what you feel (WYSIWYF). In this paper, we propose a method that achieves correct visual/haptic registration, namely WYSIWYF, by using a vision-based object-tracking technique together with a video-keying technique. Combining an encountered-type haptic device with a motion-command-type haptic rendering algorithm makes it possible to handle two extreme cases: unconstrained free motion and contact with a rigid constraint. This approach yields realistic haptic sensations, such as moving freely until touching a surface ("free to touch") and colliding with a rigid object ("move and collide"). We describe a first prototype and illustrate its use with several demonstrations. The user encounters the haptic device exactly when his or her hand reaches a virtual object in the display. Although this prototype has some remaining technical problems to be solved, it serves well to show the validity of the proposed approach.
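The two extreme cases handled by a motion-command-type (admittance-style) rendering loop can be illustrated with a minimal one-dimensional sketch. This is not the paper's implementation; the virtual mass, time step, and wall position below are illustrative assumptions. The measured user force drives a simulated virtual object, and the commanded position is clamped when it penetrates a rigid virtual wall:

```python
def admittance_step(x, v, f_user, dt, mass=1.0, wall=0.05):
    """One step of a motion-command haptic rendering loop (1-D sketch).

    x, v   : current commanded position and velocity of the virtual object [m, m/s]
    f_user : force measured at the user's hand [N]
    Returns the next (position, velocity) command for the device.
    """
    # Free motion: integrate a virtual mass under the user's force
    # (zero apparent stiffness/damping, so the device feels "free").
    a = f_user / mass
    v_new = v + a * dt
    x_new = x + v_new * dt

    # Rigid constraint: a nonpenetration clamp at the virtual wall.
    # The commanded position stops exactly at the surface, which the
    # device can render as an effectively infinite stiffness.
    if x_new > wall:
        x_new, v_new = wall, 0.0
    return x_new, v_new


# Push with a constant 1 N force: the object accelerates freely,
# then stops dead at the wall (x = 0.05 m).
x, v = 0.0, 0.0
for _ in range(200):
    x, v = admittance_step(x, v, f_user=1.0, dt=0.01)
```

Because the device tracks a simulated position rather than outputting a spring force, the same loop covers both extremes: away from the wall the user feels only the small virtual mass, and at the wall the position command simply refuses to penetrate.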
