3D robot sensing from sonar and vision

This paper describes a sensor that fuses sonar and visual data to build a three-dimensional (3D) model of the environment, with application to robot navigation. The environment is characterized by a set of connected horizontal and vertical lines. 3D sonar data are augmented with deductions about the connection and definition of lines extracted from 2D visual data. Errors arising from incorrect interpretation of the 2D camera data, such as false connections between lines, can be detected by moving the robot. Experimental results from the sensor are presented.
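One way the fusion step described above might work is to hypothesize a 3D edge between two sonar-localised targets only when both of their image projections lie on the same detected 2D line. The sketch below illustrates this consistency check under a simple pinhole camera model; the intrinsic matrix `K`, the function names, and the tolerance are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def project(K, point_3d):
    """Project a 3D point (camera frame) to pixel coordinates (pinhole model)."""
    p = K @ point_3d
    return p[:2] / p[2]

def supports_connection(K, p3d_a, p3d_b, line_2d, tol=2.0):
    """Hypothesize a 3D edge between two sonar targets if both of their
    image projections lie within `tol` pixels of a detected 2D line,
    given as (a, b, c) with a*u + b*v + c = 0 and (a, b) unit-normalized.

    This is an illustrative sketch, not the algorithm from the paper;
    false hypotheses would still need to be rejected by re-checking the
    projection after the robot moves.
    """
    a, b, c = line_2d
    for p in (p3d_a, p3d_b):
        u, v = project(K, p)
        if abs(a * u + b * v + c) > tol:
            return False
    return True

# Hypothetical camera intrinsics (focal length 500 px, centre (320, 240)).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Two sonar-localised targets at equal depth, 1 m apart horizontally.
pa = np.array([0.0, 0.0, 2.0])
pb = np.array([1.0, 0.0, 2.0])

# A horizontal image line through v = 240: 0*u + 1*v - 240 = 0.
print(supports_connection(K, pa, pb, (0.0, 1.0, -240.0)))  # → True
```

A connection accepted by this check from a single viewpoint may still be false (e.g. two unconnected targets that happen to project onto one line); as the abstract notes, such errors are exposed when the robot moves and the hypothesised edge is re-projected from the new pose.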
