Space and time sensor fusion using an active camera for mobile robot navigation

We propose a sensor-fusion technique in which data sets from previous time instants are transformed appropriately and fused with the current data sets, enabling accurate measurements such as the distance to an obstacle or the location of the service robot itself. In conventional fusion schemes, measurements depend only on the current data sets, so more sensors are required to measure a given physical parameter or to improve measurement accuracy. In the proposed approach, instead of adding sensors to the system, the temporal sequence of data sets is stored and exploited to improve the measurements. The theoretical basis is illustrated with examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both unstructured and structured environments.
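
The following is a minimal sketch of the core idea as stated in the abstract, under assumptions not specified in the source: a 2D robot with known relative odometry, point observations of an obstacle expressed in the robot frame, and inverse-variance weighting as a stand-in for the paper's actual fusion rule. All function names (se2_matrix, transform_to_current, fuse_inverse_variance) and the numerical data are hypothetical, for illustration only.

```python
import numpy as np

def se2_matrix(x, y, theta):
    """Homogeneous transform for a 2D pose (x, y, theta): robot frame -> world."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def transform_to_current(p_past, pose_past, pose_now):
    """Re-express a past observation p_past (given in the past robot frame)
    in the current robot frame, using the two odometry poses."""
    T_past = se2_matrix(*pose_past)            # past robot frame -> world
    T_now = se2_matrix(*pose_now)              # current robot frame -> world
    p_world = T_past @ np.array([p_past[0], p_past[1], 1.0])
    p_now = np.linalg.inv(T_now) @ p_world
    return p_now[:2]

def fuse_inverse_variance(points, variances):
    """Fuse several estimates of the same point by inverse-variance weighting
    (an assumed fusion rule; the paper's scheme may differ)."""
    w = 1.0 / np.asarray(variances)
    pts = np.asarray(points)
    return (pts * w[:, None]).sum(axis=0) / w.sum()

# Two past sightings of the same obstacle plus the current one:
# (observed point in robot frame, robot pose at that instant, variance).
obs = [((2.0, 0.5), (0.0, 0.0, 0.0), 0.04),
       ((1.5, 0.4), (0.5, 0.0, 0.0), 0.03),
       ((1.0, 0.3), (1.0, 0.0, 0.0), 0.02)]
pose_now = (1.0, 0.0, 0.0)

# Transform past observations into the current frame, then fuse them
# together with the current observation, rather than adding sensors.
aligned = [transform_to_current(p, pose, pose_now) for p, pose, _ in obs]
estimate = fuse_inverse_variance(aligned, [v for _, _, v in obs])
print(estimate)  # fused obstacle position in the current robot frame
```

In this sketch the temporal sequence of observations plays the role that extra sensors play in a conventional scheme: each stored data set, once transformed to the current frame, contributes an additional estimate of the same physical quantity.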
