Robot navigation using simple sensor fusion
Sensors on an autonomous mobile system are essential for determining the environment for navigation purposes. As is well documented in previous publications, sonar sensors alone are inadequate for depicting a real-world environment and therefore do not provide accurate information for navigation if not used in conjunction with another type of sensor. This paper describes a simple, inexpensive, and relatively fast navigation algorithm involving vision and sonar sensor fusion for navigating an autonomous robot in an unknown and potentially dynamic environment. Navigation of the mobile robot was accomplished using a TV camera as the primary sensor. Input data received from the camera were digitized through a video module and then processed by a dedicated vision system to detect obstacles and determine edge positions relative to the robot. Since 3D vision was not attempted due to its complex and time-consuming nature, sonar sensors were used as secondary sensors to determine the proximity of detected obstacles. By fusing the sensor data, the robot was able to navigate quickly and collision-free to a given goal, achieving obstacle avoidance in real time.
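
The sketch below illustrates the kind of fusion loop the abstract describes: vision supplies obstacle edge bearings, sonar supplies the corresponding ranges, and the two are combined into a steering command toward the goal. It is a minimal illustration under assumed interfaces (the function and parameter names such as fuse_and_steer, edges_rad, and safe_range_m are hypothetical), not the authors' actual implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class Obstacle:
    bearing_rad: float  # obstacle edge direction relative to robot heading (from vision)
    range_m: float      # proximity of that obstacle (from the sonar facing that bearing)


def fuse_and_steer(edges_rad, sonar_ranges_m, goal_bearing_rad, safe_range_m=0.5):
    """Combine vision-derived obstacle bearings with sonar ranges and
    return a steering direction toward the goal that avoids close obstacles.
    (Illustrative sketch; thresholds and blending are assumptions.)"""
    obstacles = [Obstacle(b, r) for b, r in zip(edges_rad, sonar_ranges_m)]

    # Keep only obstacles whose sonar range is within the safety threshold.
    threats = [o for o in obstacles if o.range_m < safe_range_m]
    if not threats:
        return goal_bearing_rad  # clear path: head straight for the goal

    # Turn away from the nearest threat, biased toward the goal:
    # the closer the obstacle, the stronger the avoidance term.
    nearest = min(threats, key=lambda o: o.range_m)
    avoid_dir = -math.copysign(1.0, nearest.bearing_rad)
    weight = 1.0 - nearest.range_m / safe_range_m
    return (1.0 - weight) * goal_bearing_rad + weight * avoid_dir * math.pi / 4


if __name__ == "__main__":
    # Example: goal straight ahead, one obstacle edge 10 degrees to the left
    # at 0.3 m; the fused command steers to the right, around the obstacle.
    cmd = fuse_and_steer(edges_rad=[math.radians(-10)],
                         sonar_ranges_m=[0.3],
                         goal_bearing_rad=0.0)
    print(f"steering command: {math.degrees(cmd):.1f} degrees")
```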