Teaching a Robot Bimanual Hand-Clapping Games via Wrist-Worn IMUs

Colleagues often shake hands in greeting, friends connect through high fives, and children around the world rejoice in hand-clapping games. As robots become more common in everyday human life, they will have the opportunity to join in these social-physical interactions, yet few current robots are designed to touch people in friendly ways. This article describes how we enabled a Baxter Research Robot to both teach and learn bimanual hand-clapping games with a human partner. Our system monitors the user's motions via a pair of inertial measurement units (IMUs) worn on the wrists. We recorded a labeled library of 10 common hand-clapping movements from 10 participants and used this dataset to train an SVM classifier that identifies hand-clapping motions from previously unseen participants with a test-set classification accuracy of 97.0%. Baxter uses these sensors and this classifier to quickly identify the motions of its human gameplay partner so that it can join in hand-clapping games. We evaluated this system with N = 24 naïve users in an experiment that involved learning sequences of eight motions from Baxter, teaching Baxter eight-motion game patterns, and completing a free interaction period. Motion classification accuracy dropped to 85.9% in this less structured setting, primarily because of unexpected variations in motion timing. The quantitative task performance results and qualitative participant survey responses showed that learning games from Baxter was significantly easier than teaching games to Baxter, and that the teaching role led users to consider more teamwork aspects of the gameplay. Over the course of the experiment, people felt more understood by Baxter and became more willing to follow the robot's example. Users uniformly felt safe interacting with Baxter, expressed positive opinions of the robot, and reported having fun during the interactions. Taken together, these results indicate that the robot achieved credible social-physical interaction with humans and that its ability to both lead and follow systematically changed the human partner's experience.
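
The article itself does not include source code, but the recognition pipeline it describes (windowed wrist-IMU signals, fixed-length feature vectors, an SVM trained on labeled motions and tested on held-out participants) can be sketched compactly. In the sketch below, the feature set (per-channel mean, standard deviation, minimum, and maximum), the RBF kernel, the regularization constant, and the 20% participant hold-out are illustrative assumptions rather than the authors' reported configuration.

# Minimal sketch of an IMU-based hand-clapping motion classifier,
# assuming windowed accelerometer/gyroscope data from two wrist IMUs.
# The window features and SVM settings are illustrative guesses, not
# the exact configuration reported in the article.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import GroupShuffleSplit
from sklearn.metrics import accuracy_score

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one motion window (samples x channels) with simple
    per-channel statistics: mean, std, min, and max."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

def train_and_evaluate(windows, labels, participant_ids):
    """Train an SVM on labeled IMU windows and test it on motions from
    held-out participants, mirroring a leave-participants-out setup."""
    X = np.stack([extract_features(w) for w in windows])
    y = np.asarray(labels)
    groups = np.asarray(participant_ids)

    # Hold out entire participants so the test set contains only
    # previously unseen people, as in the evaluation described above.
    splitter = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
    train_idx, test_idx = next(splitter.split(X, y, groups))

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    clf.fit(X[train_idx], y[train_idx])
    return accuracy_score(y[test_idx], clf.predict(X[test_idx]))

Splitting by participant rather than by window is the detail that matters here: it ensures the measured accuracy reflects generalization to previously unseen people, which is the condition under which the abstract reports its 97.0% figure.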
