Teaching by touching: An intuitive method for development of humanoid robot motions

This paper investigates touching as a natural way for humans to communicate with robots. In particular, we developed a system for editing the motions of a small humanoid robot by touching its body parts. The interface serves two purposes: it lets users develop robot motions in a very intuitive way, and it lets us collect data useful for studying the characteristics of touch as a means of communication. Experimental results confirm that the interface is easy for inexperienced users to operate, and analysis of the data collected during human-robot teaching episodes has yielded several useful insights.
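To make the interaction pattern concrete, the sketch below shows one plausible shape for such a touch-driven motion editor: a touch on a body part is interpreted as a request to nudge the joints of that part, and the edited pose can be committed as a keyframe of the motion. This is a minimal illustration, not the authors' implementation; the part-to-joint map, the per-touch gain, the joint limits, and all names (TouchEvent, MotionEditor, PART_TO_JOINTS) are assumptions introduced here.

```python
# Hypothetical sketch of a "teaching by touching" motion editor.
# All constants and mappings below are illustrative assumptions.

from dataclasses import dataclass, field

# Assumed mapping from touch-sensor location to the joints it edits.
PART_TO_JOINTS = {
    "left_arm": ["l_shoulder_pitch", "l_elbow"],
    "right_arm": ["r_shoulder_pitch", "r_elbow"],
    "head": ["head_yaw", "head_pitch"],
}

JOINT_LIMITS = (-90.0, 90.0)  # degrees; assumed symmetric limits
DEG_PER_TOUCH = 5.0           # assumed gain: one touch nudges a joint by 5 deg


@dataclass
class TouchEvent:
    part: str        # which tactile sensor was pressed
    direction: int   # +1 = push forward/up, -1 = push backward/down


@dataclass
class MotionEditor:
    # Current pose: one angle (deg) per joint, initialized to zero.
    pose: dict = field(default_factory=lambda: {
        j: 0.0 for joints in PART_TO_JOINTS.values() for j in joints})
    keyframes: list = field(default_factory=list)

    def apply_touch(self, event: TouchEvent) -> None:
        """Interpret a touch as an incremental, limit-clamped pose edit."""
        lo, hi = JOINT_LIMITS
        for joint in PART_TO_JOINTS[event.part]:
            self.pose[joint] = min(hi, max(
                lo, self.pose[joint] + event.direction * DEG_PER_TOUCH))

    def commit_keyframe(self) -> None:
        """Store a snapshot of the current pose as the next keyframe."""
        self.keyframes.append(dict(self.pose))


# Usage: two forward touches on the left arm, then save the pose.
editor = MotionEditor()
editor.apply_touch(TouchEvent("left_arm", +1))
editor.apply_touch(TouchEvent("left_arm", +1))
editor.commit_keyframe()
print(editor.keyframes[0]["l_shoulder_pitch"])  # -> 10.0
```

A side benefit of structuring the editor this way is that every TouchEvent can be logged alongside the resulting pose change, which is exactly the kind of data the paper describes collecting to study touch as a communication channel.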
