Evaluation of virtual fixtures for a robot programming by demonstration interface

We investigate the effectiveness of several types of virtual fixtures in a robot programming by demonstration interface. We show that while all of the virtual fixture types examined yield a significant reduction in the number of errors in tight-tolerance peg-in-hole tasks, color and sound fixtures generally outperform a tactile fixture in terms of both the execution time of successful trials and the error rate. We also find that when users perceive the task as very difficult but recognize that the system is providing some help by means of a virtual fixture, they tend to spend more time trying to complete the task successfully. Thus, for difficult tasks the benefits of virtual fixturing are better reflected in a reduced error rate than in a decreased execution time. We conjecture that these trends are related to the limitations of currently available interfaces for human-robot interaction through virtual environments, and to the different strategies adopted by users to cope with such limitations in high-accuracy tasks.
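
To make the fixture modalities concrete, the following is a minimal, purely illustrative sketch (not the authors' implementation) of how a virtual fixture in a peg-in-hole teaching scene might map the peg's distance from the hole axis to a visual, auditory, or haptic cue. All names, thresholds, and the cue interface below are assumptions introduced for illustration.

from dataclasses import dataclass
import math

@dataclass
class Hole:
    axis_origin: tuple  # a point on the hole axis (x, y, z)
    axis_dir: tuple     # unit vector along the axis

def distance_to_axis(p, hole):
    """Perpendicular distance from point p to the hole axis."""
    ox, oy, oz = hole.axis_origin
    dx, dy, dz = hole.axis_dir
    vx, vy, vz = p[0] - ox, p[1] - oy, p[2] - oz
    # |v x d| equals |v| * sin(theta), i.e. the distance from the axis when |d| = 1
    cx = vy * dz - vz * dy
    cy = vz * dx - vx * dz
    cz = vx * dy - vy * dx
    return math.sqrt(cx * cx + cy * cy + cz * cz)

def fixture_cues(peg_tip, hole, radius=0.01):
    """Hypothetical activation rule: once the peg tip is within `radius` of the
    hole axis, issue cues whose intensity grows as the peg approaches the axis."""
    d = distance_to_axis(peg_tip, hole)
    if d > radius:
        return {"color": None, "sound": None, "haptic_force": 0.0}
    strength = 1.0 - d / radius
    return {
        "color": "green" if strength > 0.5 else "yellow",  # color fixture
        "sound": strength,                                  # sound fixture (e.g. tone volume)
        "haptic_force": 2.0 * strength,                     # tactile fixture (attraction force, N)
    }

if __name__ == "__main__":
    hole = Hole(axis_origin=(0.0, 0.0, 0.0), axis_dir=(0.0, 0.0, 1.0))
    print(fixture_cues((0.004, 0.0, 0.05), hole))

Under these assumptions, the same proximity signal drives all three modalities; only the rendering channel (highlight color, tone, or attraction force) differs, which is what makes a modality-by-modality comparison such as the one reported above possible.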
