Human-oriented recognition for intelligent interactive office robot

This paper presents our new intelligent interactive robot, which is designed to provide multi-functional services in an office environment. To endow the robot with full interactive capability for human-robot interaction (HRI), we propose a sensor-fusion-based human detection and tracking system together with human pose estimation, so as to handle the variety of situations that may arise in an office environment. Beyond these perception capabilities, humans can also interact with the robot in natural ways, such as touching its interface screen or talking to it through a microphone. Finally, the effectiveness of the proposed system is tested and validated through experiments.
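The abstract does not specify the fusion algorithm, but sensor-fusion-based human tracking of this kind is commonly realized with a Kalman filter that combines position detections from multiple sensors (e.g., a laser range finder and a camera), each weighted by its measurement noise. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual implementation; the sensor noise values and the constant-velocity motion model are assumptions.

```python
import numpy as np

# Minimal constant-velocity Kalman filter sketch for fusing (x, y) person
# detections from two sensors with different accuracies. This is an
# illustrative assumption, not the method described in the paper.
class FusionTracker:
    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                      # state: [x, y, vx, vy]
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)                        # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = 0.01 * np.eye(4)                 # process noise
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])     # both sensors observe (x, y)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, sensor_var):
        # Standard Kalman update; sensor_var encodes each sensor's accuracy,
        # so more accurate sensors pull the estimate harder.
        R = sensor_var * np.eye(2)
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# Track a person walking along x, fusing an accurate laser detection
# with a noisier, slightly biased camera detection at each step.
tracker = FusionTracker()
for step in range(20):
    tracker.predict()
    tracker.update([step * 0.1, 0.0], sensor_var=0.05)        # laser
    tracker.update([step * 0.1 + 0.02, 0.0], sensor_var=0.5)  # camera
print(np.round(tracker.x[:2], 2))
```

Fusing both measurements in each cycle keeps the track alive when one sensor misses the person, which is the practical motivation for sensor fusion in cluttered office scenes.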
