Human gesture recognition performance evaluation for service robots

Intelligent service robots increasingly rely on gesture recognition, often built on the Microsoft Kinect sensor, to enable natural human-robot interaction. To evaluate gesture recognition performance in real-life environments, we constructed a new gesture database that accounts for cluttered backgrounds, varying distances and poses, and robot movement, and we used it to evaluate the gesture recognition performance of commercial robots. This paper aims to provide consumers, robot manufacturers, and gesture recognition engine developers with comparable results on the gesture recognition capabilities of service robots.
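The paper does not reproduce its evaluation protocol here, but a benchmark of this kind implies scoring recognition accuracy separately for each capture condition (background clutter, distance, pose, robot motion). The following minimal sketch illustrates that idea; the condition names, record fields, and the `per_condition_accuracy` helper are illustrative assumptions, not the authors' code.

```python
from collections import defaultdict

def per_condition_accuracy(samples):
    """Compute gesture recognition accuracy grouped by capture condition.

    Each sample is a dict with hypothetical keys:
      'condition' : capture condition, e.g. 'cluttered_bg' or 'distance_3m'
      'label'     : ground-truth gesture class
      'predicted' : class returned by the recognition engine under test
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for s in samples:
        total[s['condition']] += 1
        if s['predicted'] == s['label']:
            correct[s['condition']] += 1
    # Per-condition accuracy exposes where an engine degrades
    # (e.g. at longer distances or while the robot is moving).
    return {c: correct[c] / total[c] for c in total}

# Toy example with two distance conditions.
results = [
    {'condition': 'distance_1m', 'label': 'wave',  'predicted': 'wave'},
    {'condition': 'distance_1m', 'label': 'point', 'predicted': 'point'},
    {'condition': 'distance_3m', 'label': 'wave',  'predicted': 'point'},
    {'condition': 'distance_3m', 'label': 'point', 'predicted': 'point'},
]
print(per_condition_accuracy(results))  # {'distance_1m': 1.0, 'distance_3m': 0.5}
```

Reporting one accuracy figure per condition, rather than a single aggregate, is what makes results comparable across robots tested on the same database.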
