“Human-like intelligence requires human-like interactions with the world.”

The emergence of robot applications and their growing availability to non-technical users call for new ways of interaction between these electronic devices and their users. Human-Robot Interaction (HRI) is a research area that studies the dynamics involved in the interaction between humans and robots. It draws on several fields of knowledge, such as natural language processing, computer vision, machine learning and electronics, and even on social sciences such as psychology and human communication. HRI aims at creating natural interfaces between humans and robots that are intuitive and easy to use without prior knowledge or training. The main goal of this Master Thesis is the development of a gestural interface for interacting with robots in a way similar to how humans interact with each other, allowing the user to communicate information beyond the linguistic description of the task (non-verbal communication). To fulfill this objective, a gesture recognition application has been implemented using the Microsoft Kinect v2 sensor. A real-time algorithm is described that handles two kinds of gestures, static and dynamic, the latter being recognized with a weighted Dynamic Time Warping method. Skeletal features are used to define both kinds of gestural sequences, each gesture having its own set of specific features. The Kinect-based gesture recognition application has been deployed in a multi-robot setting: a NAO humanoid robot is in charge of interacting with the users and responding to the visual signals they produce, while a wheeled Wifibot robot carries both the sensor and the NAO robot, easing navigation when necessary. The system currently recognizes two gestures, one of each kind. The dynamic gesture is a wave with which the user greets the robot, while the static one is a gesture pointing at an object.
When the pointing gesture is performed, the robot looks for objects near the indicated location and tries to detect which object the user was referring to, asking him or her about it if needed. Once the requested object is recognized, the robot climbs down from the wheeled platform, approaches the object and shows it to the user. A broad set of user tests has been carried out, demonstrating that the system is indeed a natural approach to human-robot interaction, with fast response, ease of use and high gesture recognition rates. Possible applications of this kind of system to household environments are also discussed.
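The weighted Dynamic Time Warping used for the dynamic (wave) gesture can be illustrated with a minimal sketch. This is not the thesis implementation: the function name, the per-feature weight vector and the weighted Euclidean local distance are illustrative assumptions; the thesis learns gesture-specific weights over its own skeletal features.

```python
import numpy as np

def weighted_dtw(seq_a, seq_b, weights):
    """Weighted DTW distance between two gesture sequences.

    seq_a, seq_b: (n_frames, n_features) arrays of skeletal features.
    weights: (n_features,) per-feature importances used in the local
    distance (hypothetical weighting scheme for illustration).
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Weighted Euclidean distance between the two frames.
            d = np.sqrt(np.sum(weights * (seq_a[i - 1] - seq_b[j - 1]) ** 2))
            # Standard DTW recurrence: extend the cheapest warping path.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]
```

In a real-time recognizer such as the one described, a candidate sequence from the live skeletal stream would be compared against stored templates of each dynamic gesture, accepting a match when the warped distance falls below a per-gesture threshold.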
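The static pointing gesture reduces, at its simplest, to extending a ray through two skeletal joints until it meets a support surface. The sketch below is a simplified geometric illustration under assumed conventions (y-up camera-space coordinates, an elbow-to-hand pointing ray, a horizontal floor plane); the function name and arguments are hypothetical, and the actual system may fuse additional cues before querying for nearby objects.

```python
import numpy as np

def pointed_floor_location(elbow, hand, floor_height=0.0):
    """Estimate where a pointing gesture intersects the floor plane.

    elbow, hand: 3-D joint positions (x, y, z) with y pointing up, as a
    Kinect-style skeleton might provide. The pointing ray runs from the
    elbow through the hand and is extended to y == floor_height.
    Returns None if the ray points upward or parallel to the floor.
    """
    elbow = np.asarray(elbow, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - elbow
    if direction[1] >= 0:          # ray never reaches the floor
        return None
    # Solve hand.y + t * direction.y == floor_height for t >= 0.
    t = (floor_height - hand[1]) / direction[1]
    return hand + t * direction
```

The returned point would then seed a search for candidate objects in its neighborhood, with the robot asking the user to disambiguate when several objects lie near the estimated location.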

[1]  Katsuhiko Ogata,et al.  Modern Control Engineering , 1970 .

[2]  S. Chiba,et al.  Dynamic programming algorithm optimization for spoken word recognition , 1978 .

[3]  Robert C. Bolles,et al.  Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography , 1981, CACM.

[4]  M. Hecht,et al.  The nonverbal communication reader , 1990 .

[5]  Sebastian Thrun,et al.  A Gesture Based Interface for Human-Robot Interaction , 2000, Auton. Robots.

[6]  Andrea Lockerd Thomaz,et al.  Effects of nonverbal communication on efficiency and robustness in human-robot teamwork , 2005, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[7]  Tao Zhang,et al.  Adaptive visual gesture recognition for human-robot interaction using a knowledge-based software platform , 2007, Robotics Auton. Syst..

[8]  B. Gates A robot in every home. , 2007, Scientific American.

[9]  Rainer Stiefelhagen,et al.  Visual recognition of pointing gestures for human-robot interaction , 2007, Image Vis. Comput..

[10]  Elin Anna Topp,et al.  Human-Robot Interaction and Mapping with a Service Robot: Human Augmented Mapping , 2008 .

[11]  Todor Todoroff,et al.  REAL-TIME DTW-BASED GESTURE RECOGNITION EXTERNAL OBJECT FOR MAX/MSP AND PUREDATA , 2009 .

[12]  Peter Robinson,et al.  Cooperative gestures: effective signaling for humanoid robots , 2010, HRI 2010.

[13]  P. Trahanias,et al.  Temporal gesture recognition for human-robot interaction , 2011 .

[14]  Víctor González-Pacheco,et al.  Integration of a low-cost RGB-D sensor in a social robot for gesture recognition , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[15]  Sergio Escalera,et al.  Featureweighting in dynamic timewarping for gesture recognition in depth data , 2011, 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops).

[16]  Seong-Whan Lee,et al.  Real-time 3D pointing gesture recognition for mobile robots with cascade HMM and particle filter , 2011, Image Vis. Comput..

[17]  Yee-Pien Yang,et al.  Tracking with pointing gesture recognition for human-robot interaction , 2011, 2011 IEEE/SICE International Symposium on System Integration (SII).

[18]  Luc Van Gool,et al.  Real-time 3D hand gesture interaction with a robot for understanding directions from humans , 2011, 2011 RO-MAN.

[19]  Maria Pateraki,et al.  Visual estimation of pointed targets for robot guidance via fusion of face pose and hand orientation , 2011, 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops).

[20]  Jörg Stückler,et al.  Learning to interpret pointing gestures with a time-of-flight camera , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[21]  Kevin O'Brien,et al.  Collaboration with an autonomous humanoid robot: A little gesture goes a long way , 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[22]  Luca Maria Gambardella,et al.  Incremental learning using partial feedback for gesture-based human-swarm interaction , 2012, 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[23]  Tsukasa Ogasawara,et al.  Body gesture classification based on Bag-of-features in frequency domain of motion , 2012, 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[24]  Hyun Myung,et al.  Gesture recognition algorithm for moving kinect sensor , 2013, 2013 IEEE RO-MAN.

[25]  Raquel Frizera Vassallo,et al.  Human–Robot Interaction and Cooperation Through People Detection and Gesture Recognition , 2013, Journal of Control, Automation and Electrical Systems.

[26]  Tarik Arici,et al.  Robust gesture recognition using feature pre-processing and weighted dynamic time warping , 2014, Multimedia Tools and Applications.

[27]  Thi Thanh Hai Tran How can human communicate with robot by hand gesture? , 2013, 2013 International Conference on Computing, Management and Telecommunications (ComManTel).

[28]  Indira Thouvenin,et al.  Human gesture segmentation based on change point model for efficient gesture interface , 2013, 2013 IEEE RO-MAN.

[29]  Andrew W. Fitzgibbon,et al.  Real-time human pose recognition in parts from single depth images , 2011, CVPR 2011.

[30]  Radu Bogdan Rusu Clustering and Segmentation , 2013 .

[31]  Stefan Wermter,et al.  HandSOM - neural clustering of hand motion for gesture recognition in real time , 2014, The 23rd IEEE International Symposium on Robot and Human Interactive Communication.

[32]  Tatsuya Fujii,et al.  Gesture recognition system for Human-Robot Interaction and its application to robotic service task , 2014 .

[33]  Sergio Escalera,et al.  Probability-based Dynamic Time Warping and Bag-of-Visual-and-Depth-Words for Human Gesture Recognition in RGB-D , 2014, Pattern Recognit. Lett..

[34]  Chen Qian,et al.  Realtime and Robust Hand Tracking from Depth , 2014, 2014 IEEE Conference on Computer Vision and Pattern Recognition.

[35]  Allison Sauppé,et al.  Robot Deictics: How Gesture and Context Shape Referential Communication , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[36]  Xose Manuel Pardo,et al.  Gesture-based interaction with voice feedback for a tour-guide robot , 2014, J. Vis. Commun. Image Represent..

[37]  Luke S. Zettlemoyer,et al.  Learning from Unscripted Deictic Gesture and Language for Human-Robot Interactions , 2014, AAAI.

[38]  Shobhit Maheshwari,et al.  Hand gesture pointing location detection , 2014 .

[39]  Sukhan Lee,et al.  Kinect based calling gesture recognition for taking order service of elderly care robot , 2014, The 23rd IEEE International Symposium on Robot and Human Interactive Communication.

[40]  Sergio Escalera,et al.  ChaLearn Looking at People Challenge 2014: Dataset and Results , 2014, ECCV Workshops.

[41]  Silvia Rossi,et al.  Continuous gesture recognition for flexible human-robot interaction , 2014, 2014 IEEE International Conference on Robotics and Automation (ICRA).

[42]  Stefan Wermter,et al.  Real-time gesture recognition using a humanoid robot with a deep neural architecture , 2014, 2014 IEEE-RAS International Conference on Humanoid Robots.

[43]  Andrew W. Fitzgibbon,et al.  Accurate, Robust, and Flexible Real-time Hand Tracking , 2015, CHI.

[44]  Cecilio Angulo,et al.  Using a cognitive architecture for general purpose service robot control , 2015, Connect. Sci..

[45]  Dan Xu,et al.  Online Dynamic Gesture Recognition for Human Robot Interaction , 2015, J. Intell. Robotic Syst..