LOW-LEVEL SENSOR FUSION-BASED HUMAN TRACKING FOR MOBILE ROBOT

In this paper, a novel sensor-based human tracking method that enables a mobile robot to follow a human with high robustness and responsiveness is presented. The method is based on low-level fusion of depth data from a stereo camera and range data from an infrared 2D laser range finder (LRF) to detect the target human in the near surroundings of the robot. After the initial position of the target human is located by sensor fusion-based human detection, a novel tracking algorithm that combines a laser data-based search window with a Kalman filter recursively predicts and updates estimates of the target human's position in the robot's coordinate system. The tracking window reduces computational cost by restricting processing to a region of interest (ROI), thus enabling real-time performance. The performance of the proposed system was tested in several indoor scenarios. Experimental results show that the proposed human detection algorithm is robust and that the tracking algorithm is able to handle fast human movements and keep track of the target human in various scenarios.
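The predict-update loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a constant-velocity motion model for the target's (x, y) position in the robot frame, a square search window centred on the predicted position for gating laser measurements, and illustrative noise matrices (`Q`, `R`), sample period (`DT`), and window half-width.

```python
import numpy as np

DT = 0.1  # assumed sensor update period [s]

# State: [x, y, vx, vy]; measurement: [x, y] (target position in robot frame)
F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)  # process noise covariance (assumed)
R = 0.05 * np.eye(2)  # measurement noise covariance (assumed)

def predict(x, P):
    """Propagate the state estimate one step with the motion model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Correct the prediction with a gated laser measurement z = [x, y]."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

def in_window(x_pred, z, half_width=0.5):
    """Search window / ROI: accept only measurements near the prediction."""
    return bool(np.all(np.abs(z - x_pred[:2]) <= half_width))

# Track a target drifting in +x; measurements outside the window are skipped.
x = np.zeros(4)
P = np.eye(4)
for z in [np.array([0.10, 0.00]), np.array([0.20, 0.01])]:
    x, P = predict(x, P)
    if in_window(x, z):
        x, P = update(x, P, z)
```

In a full system, the window would be applied to the raw LRF scan to discard clutter before leg/torso detection, so only a small subset of range readings is processed each cycle.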
