What you can see is what you can feel - development of a visual/haptic interface to virtual environment

We propose a new concept of visual/haptic interface called the WYSIWYF (What You See Is What You Feel) display. The concept achieves correct visual/haptic registration by combining a vision-based object tracking technique with a video keying technique, so that what the user sees through the visual interface is consistent with what he/she feels through the haptic interface. Using chroma keying, a live video image of the user's hand is extracted and blended with the graphic scene of the virtual environment. The user's hand "encounters" the haptic device exactly when the hand touches a virtual object in the blended scene. A first prototype has been built, and the proposed concept has been demonstrated.
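
As a rough illustration of the video keying step described in the abstract, the following Python/OpenCV sketch keys out a uniform backdrop color so that the live camera image of the user's hand is kept and every other pixel is taken from the rendered virtual scene. This is a minimal sketch under stated assumptions, not the authors' implementation: the green backdrop, the HSV thresholds, and the function name are hypothetical choices made for illustration.

```python
import numpy as np
import cv2  # assumed available; any image library with HSV conversion would do


def chroma_key_composite(camera_frame, rendered_scene,
                         lower_hsv=(35, 80, 80), upper_hsv=(85, 255, 255)):
    """Blend the live camera image into the rendered virtual scene.

    camera_frame, rendered_scene: HxWx3 uint8 BGR images of identical size.
    Pixels of camera_frame matching the key color (assumed green backdrop,
    hypothetical thresholds) are replaced by the rendered scene; all other
    pixels, e.g. the user's hand, are kept in the composite.
    """
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    key_mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))  # 255 where backdrop
    foreground_mask = cv2.bitwise_not(key_mask)                            # 255 where hand / real objects
    composite = np.where(foreground_mask[..., None] > 0, camera_frame, rendered_scene)
    return composite.astype(np.uint8)
```

In the WYSIWYF setting, the rendered scene would come from the graphics pipeline registered to the tracked camera pose, so the composited hand appears to touch the virtual object exactly where the haptic device presents the contact.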
