Laser-based people tracking by multiple mobile robots

This paper presents laser-based people tracking by a group of mobile robots operating near one another. Each robot detects moving people in its own laser scan images using an occupancy-grid-based method and then tracks the detected people with a Kalman filter and global-nearest-neighbor (GNN) data association. The tracking data are broadcast to the other robots via inter-robot communication and fused using the covariance intersection method. Because all robots share their tracking data, each robot can continue to track people who are occluded from its own view as long as at least one other robot observes them. The method is validated by an experiment in which three mobile robots track two walking people.
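As a rough illustration of the fusion step named above, the sketch below shows covariance intersection applied to two Gaussian track estimates of the same person reported by different robots. This is a minimal generic sketch, not the paper's implementation: the state layout [x, y, vx, vy], the trace-minimizing choice of the mixing weight, and the example numbers are all assumptions for illustration.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_omega=51):
    """Fuse two estimates (x1, P1) and (x2, P2) whose cross-correlation
    is unknown, using covariance intersection (CI).

    The weight omega is swept over [0, 1] and chosen to minimize the
    trace of the fused covariance (one common criterion, assumed here).
    """
    best = None
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    for omega in np.linspace(0.0, 1.0, n_omega):
        # CI information-form combination for the current weight.
        info = omega * P1_inv + (1.0 - omega) * P2_inv
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < best[0]:
            x = P @ (omega * P1_inv @ x1 + (1.0 - omega) * P2_inv @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Hypothetical example: two robots estimate the same person's state [x, y, vx, vy].
x_a = np.array([1.0, 2.0, 0.30, 0.00])        # estimate from robot A
P_a = np.diag([0.20, 0.20, 0.05, 0.05])
x_b = np.array([1.1, 1.9, 0.25, 0.05])        # estimate from robot B
P_b = np.diag([0.10, 0.30, 0.04, 0.06])
x_fused, P_fused = covariance_intersection(x_a, P_a, x_b, P_b)
```

Unlike a naive Kalman-style combination, covariance intersection never assumes the two estimates are independent, so the fused covariance stays consistent even when the robots' tracks share common information.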
