Using spatial and temporal contrast for fluent robot-human hand-overs

For robots to be integrated into daily tasks assisting humans, robot-human interactions will need to reach a level of fluency close to that of human-human interactions. In this paper we address the fluency of robot-human hand-overs. From an observational study with our robot HERB, we identify the key problems with a baseline hand-over action. We find that failure to convey the intention of handing over causes delays in the transfer, while the lack of an intuitive signal indicating the timing of the hand-over causes early, unsuccessful attempts to take the object. We propose to address these problems with spatial contrast, in the form of distinct hand-over poses, and temporal contrast, in the form of unambiguous transitions to the hand-over pose. We conduct a survey to identify distinct hand-over poses and to determine which variables of the pose have the most communicative potential for conveying the intent of handing over. We then present an experiment analyzing the effect of the two types of contrast on the fluency of hand-overs. We find that temporal contrast is particularly useful in improving fluency by eliminating early attempts by the human.