A Framework for User-Defined Body Gestures to Control a Humanoid Robot

This paper presents a framework that allows users to control and navigate a humanoid robot using body gestures. The first part of the paper describes a study to define intuitive gestures for eleven navigational commands, based on an analysis of 385 gestures performed by 35 participants. From the study results, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, and the performance times of the gesture motions. The second part of the paper presents a full-body interaction system for recognizing the user-defined gestures. We evaluated the recognition performance of the proposed system with 22 recruited participants. The results show that most of the defined gestures can be successfully recognized, with a precision between 86% and 100% and an accuracy between 73% and 96%. We discuss the limitations of the system and outline improvements for future work.
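Agreement scores in gesture-elicitation studies of this kind are conventionally computed with the guessability metric of Wobbrock et al.: for each command (referent), participants' proposed gestures are grouped by identity, and the score is the sum of the squared fractions of proposals in each group. The sketch below is a minimal Python illustration of that standard metric, not the paper's own implementation; the gesture labels and counts are hypothetical example data.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one command (referent), following the
    standard elicitation-study metric: sum, over each group of
    identical gesture proposals, of (group size / total proposals)^2.
    `proposals` is a list of gesture labels, one per participant."""
    total = len(proposals)
    groups = Counter(proposals)
    return sum((count / total) ** 2 for count in groups.values())

# Hypothetical example: 35 participants propose gestures for one command;
# 20 point ahead, 10 lean forward, 5 step in place.
labels = ["point"] * 20 + ["lean"] * 10 + ["step"] * 5
print(f"Agreement: {agreement_score(labels):.3f}")  # ~0.429
```

A score of 1.0 means every participant proposed the same gesture for a command, while values approaching 1/n for n distinct proposals indicate little consensus; commands with low agreement are typically the ones where a designer-chosen gesture must be assigned.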
