Integrating acoustical and optical sensory data for mobile robots

This paper presents a new approach for a mobile robot that uses sonar and vision sensing to recognize indoor scenes. Because the two sensing modalities are complementary, combining them yields information about the observed world that neither sensor can provide alone. The approach integrates the acoustical and optical sensory data using a set of simple rules, rather than mathematical models, to build a coherent representation of the robot's environment.
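As a purely illustrative sketch of what such rule-based integration might look like (the data types, rule conditions, and labels below are hypothetical and not taken from the paper), one could combine a sonar echo with a vision feature as follows:

```python
# Hypothetical sketch of rule-based sonar/vision fusion.
# All names and thresholds are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class SonarReading:
    range_m: float        # distance returned by the sonar, in metres
    strong_echo: bool     # True if the echo amplitude exceeds a threshold

@dataclass
class VisionFeature:
    kind: str             # e.g. "vertical_edge", "uniform_region", "none"

def fuse(sonar: SonarReading, vision: VisionFeature) -> str:
    """Label the observed surface by applying simple rules to both modalities."""
    # Rule 1: a strong sonar echo plus a uniform image region suggests a flat wall.
    if sonar.strong_echo and vision.kind == "uniform_region":
        return "wall"
    # Rule 2: a weak echo but a visible vertical edge suggests an edge or corner,
    # which sonar alone can miss because the echo is scattered away.
    if not sonar.strong_echo and vision.kind == "vertical_edge":
        return "edge"
    # Rule 3: a strong echo with no visual feature may indicate a specular
    # surface (e.g. glass) that vision alone cannot detect.
    if sonar.strong_echo and vision.kind == "none":
        return "specular_surface"
    return "unknown"
```

Each rule captures a case where one sensor compensates for the other's blind spot, which is the sense in which the modalities are complementary.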