Give me a hand — How users ask a robotic arm for help with gestures

A task that requires both hands, such as soldering, usually calls for an additional tool to hold an object (e.g., a cable) or add one (e.g., solder) at a specific position. A robotic manipulator, or robotic arm, is one solution to this requirement. When gesture is chosen as the method for controlling the robot, the characteristics of such gestures must be understood before a gesture recognition system can be designed and developed. To that end, we conducted an experiment to obtain a set of user-defined gestures for the soldering task and to identify gesture properties and patterns for the future development of our research. We collected 152 gestures from 19 participants by presenting the “effect” of a gesture (a robotic arm movement) and then asking the participants to perform its “cause” (a user-defined gesture). The analyzed data show that the hands are the most-used body parts even when they are occupied by the task, that participants used one-hand and two-hand gestures interchangeably, that the majority of participants performed reversible gestures for reversible movements, and that participants expected better recognition performance for gestures that are easier to plan. Our findings can serve as a guideline for creating gesture sets and systems for controlling robotic arms based on users' natural behavior.
