Toward Robot Learning of Tool Manipulation from Human Demonstration

Robots that manipulate everyday tools in unstructured, human settings could more easily work alongside people and perform tasks that matter to them. Task demonstration offers an intuitive way for people to program robots. By focusing on task-relevant features during both the demonstration and the execution of a task, a robot can more robustly emulate the important characteristics of the task and generalize what it has learned. In this paper we describe a method for robot task learning based on the perception and control of the tip of a tool. With this approach, the robot monitors the tool's tip during human use, extracts the trajectory of this task-relevant feature, and then manipulates the tool by controlling the same feature. We present preliminary results in which a humanoid robot learns to clean a flexible hose with a brush. The task is accomplished in an unstructured environment without prior models of the objects or the task.
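
To make the pipeline above concrete, here is a minimal sketch of the trajectory-extraction step, assuming a per-frame tip detector; `detect_tool_tip` and the camera `frames` are hypothetical placeholders, not the paper's implementation. Detections are accumulated over the demonstration and low-pass filtered so the robot could later servo its own tool tip along the result.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def extract_tip_trajectory(frames, detect_tool_tip, sigma=2.0):
    """Turn per-frame tool-tip detections into a smoothed 3D trajectory.

    detect_tool_tip is a hypothetical per-frame detector returning the
    tip position as (x, y, z), or None when the tip is not found.
    """
    # Keep only the frames in which the detector found the tip.
    points = [p for p in (detect_tool_tip(f) for f in frames) if p is not None]
    traj = np.asarray(points, dtype=float)  # shape (T, 3)
    # Low-pass filter each coordinate over time to suppress detection jitter.
    return gaussian_filter1d(traj, sigma=sigma, axis=0)
```

The smoothed trajectory is a compact, object-centered representation of the demonstration: a Cartesian controller can track it with the robot's own tool tip without needing a model of the tool or the task.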
