Teaching robot's movement in virtual reality

The authors propose a robot teaching method that uses virtual reality. An operator wearing a VPL DataGlove performs a task in a virtual workspace that simulates a real workspace. The operator's movements are recognized and interpreted task-dependently using interpretation rules and a world model. A robot then executes the task in the real workspace using sensors. The overall system architecture and experiments are presented.
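
To make the described pipeline concrete, the following is a minimal illustrative sketch of task-dependent interpretation: recognized operator movements plus a world model are mapped to task-level robot commands. All names here (Gesture, WorldModel, the grasp/release rule) are assumptions for illustration, not the paper's actual interpretation rules or data structures.

```python
from dataclasses import dataclass, field

@dataclass
class Gesture:
    kind: str         # e.g. "grasp" or "release" (hypothetical labels)
    position: tuple   # (x, y, z) in virtual-workspace coordinates

@dataclass
class WorldModel:
    # object name -> position; a stand-in for the paper's world model
    objects: dict = field(default_factory=dict)

    def nearest_object(self, position, tolerance=0.05):
        """Return the object closest to `position` within `tolerance`."""
        best, best_d = None, tolerance
        for name, pos in self.objects.items():
            d = sum((a - b) ** 2 for a, b in zip(pos, position)) ** 0.5
            if d < best_d:
                best, best_d = name, d
        return best

def interpret(gestures, world):
    """Toy interpretation rule: a grasp near a known object binds it;
    a subsequent release emits a task-level pick-and-place command."""
    commands, held = [], None
    for g in gestures:
        if g.kind == "grasp":
            held = world.nearest_object(g.position)
        elif g.kind == "release" and held is not None:
            commands.append(("pick_and_place", held, g.position))
            world.objects[held] = g.position  # keep world model current
            held = None
    return commands

if __name__ == "__main__":
    world = WorldModel(objects={"block_a": (0.10, 0.20, 0.00)})
    demo = [Gesture("grasp", (0.11, 0.19, 0.01)),
            Gesture("release", (0.40, 0.20, 0.00))]
    print(interpret(demo, world))
    # -> [('pick_and_place', 'block_a', (0.4, 0.2, 0.0))]
```

The point of separating recognition (gesture labels) from interpretation (rules consulting the world model) is that the same hand motion can yield different task-level commands depending on workspace state, which is why the abstract calls the interpretation task-dependent.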
