Teaching for multi-fingered robots based on motion intention in virtual reality
[1] Haruhisa Kawasaki, et al. "Dexterous anthropomorphic robot hand with distributed tactile sensor: Gifu hand II," Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC'99), 1999.
[2] Shigeyuki Sakane, et al. "A Human-Robot Interface Using an Extended Digital Desk Approach," 1998.
[3] Rüdiger Dillmann, et al. "Building elementary robot skills from human demonstration," Proceedings of the IEEE International Conference on Robotics and Automation, 1996.
[4] H. Harry Asada, et al. "The direct teaching of tool manipulation skills via the impedance identification of human motions," Proceedings of the 1988 IEEE International Conference on Robotics and Automation, 1988.
[5] T. Flash, et al. "The coordination of arm movements: an experimentally confirmed mathematical model," The Journal of Neuroscience, 1985.
[6] Tomoichi Takahashi, et al. "Robotic assembly operation based on task-level teaching in virtual reality," Proceedings of the 1992 IEEE International Conference on Robotics and Automation, 1992.
[7] Katsushi Ikeuchi, et al. "Toward automatic robot instruction from perception: temporal segmentation of tasks from human hand motion," IEEE Transactions on Robotics and Automation, 1993.
[8] Hiroshi Mizoguchi, et al. "Active Understanding of Human Intention by a Robot through Monitoring of Human Behavior," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 1994.
[9] Masayuki Inaba, et al. "Learning by watching: extracting reusable task knowledge from visual observation of human performance," IEEE Transactions on Robotics and Automation, 1994.