Gesture recognition system for real-time mobile robot control based on inertial sensors and motion strings

Abstract: Navigating and controlling a mobile robot in indoor or outdoor environments using body-worn sensors is an increasingly active research area in robotics. In such scenarios, hand gestures offer unique capabilities for human–robot interaction inherent to nonverbal communication, with features and application scenarios not possible with the currently predominant vision-based systems. We therefore propose and develop an effective inertial-sensor-based system, worn by the user, together with a microprocessor and a wireless module for communication with the robot at distances of up to 250 m. Candidate features describing hand-gesture dynamics are introduced, and their feasibility is demonstrated in an off-line scenario using several classification methods (e.g., random forests and artificial neural networks). Refined motion features are then quantized by K-means unsupervised clustering into motion primitives, which form the motion strings used for real-time classification. The system achieved an F1 score of 90.05% while supporting gesture spotting and null-class rejection (i.e., undefined gestures were discarded from the analysis). Finally, to demonstrate the feasibility of the proposed algorithm, it was implemented on an Arduino-based 8-bit ATmega2560 microcontroller to control a mobile tracked robot platform.
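The pipeline described above (K-means quantization of motion features into primitives, concatenation into motion strings, and string-matching classification with a null class) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature windows, cluster count, gesture templates, and the edit-distance rejection threshold `max_dist` are all illustrative assumptions, and plain Levenshtein distance stands in for whatever string-matching variant the authors used.

```python
# Hedged sketch of motion-string gesture classification.
# Assumptions (not from the paper): feature windows are small numeric
# vectors, primitives are K-means centroids labeled 'a', 'b', ..., and
# classification is nearest-template Levenshtein distance with a
# threshold for null-class (undefined gesture) rejection.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's K-means; returns the k primitive centroids."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def to_string(windows, centers):
    """Map each feature window to its nearest primitive symbol."""
    labels = np.argmin(((windows[:, None] - centers) ** 2).sum(-1), axis=1)
    return "".join(chr(ord("a") + int(l)) for l in labels)

def levenshtein(s, t):
    """Edit distance between two motion strings (one-row DP)."""
    d = np.arange(len(t) + 1)
    for i, cs in enumerate(s, 1):
        prev, d[0] = d[0], i
        for j, ct in enumerate(t, 1):
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (cs != ct))
    return int(d[-1])

def classify(query, templates, max_dist=2):
    """Nearest gesture template by edit distance; distances above
    max_dist fall into the null class (gesture rejected as undefined)."""
    best = min(templates, key=lambda g: levenshtein(query, templates[g]))
    return best if levenshtein(query, templates[best]) <= max_dist else None
```

For example, with templates `{"circle": "aabb", "wave": "ccdd"}`, the observed string `"aab"` is one edit from `"circle"` and would be accepted, while `"zzzz"` is far from every template and falls into the null class. On a microcontroller, the centroids would be computed off-line and only the quantization and edit-distance steps would run in real time.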
