An Active Omni-directional Range Sensor for Mobile Robot Navigation

Abstract Most autonomous mobile robots perceive only the region in front of them and may therefore collide with objects approaching from the side or from behind. To overcome this problem, we have built an active omni-directional range sensor that obtains omni-directional depth data using a laser conic plane and a conic mirror. During navigation, the proposed sensor system forms the laser conic plane by rotating a laser point source at high speed, producing a two-dimensional depth map in real time from a single captured image. Object recognition is possible with a three-dimensional depth map built by combining the previously obtained two-dimensional depth maps. Moreover, because the sensor measures the actual distance to target objects, it can also be applied to other measurement tasks. Experimental results show that the proposed sensor system has strong potential for object recognition and navigation of a mobile robot in an unknown environment.
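The core ranging idea described above can be illustrated with a minimal triangulation sketch. This is not the paper's actual conic-mirror geometry; it models only a planar cross-section in which a camera at the origin views a laser sheet emitted from a known baseline offset, and all names and parameter values (`f_px`, `baseline_m`, `cone_angle_rad`) are assumptions for illustration.

```python
import math

def range_from_ring(p_px, f_px, baseline_m, cone_angle_rad):
    """Illustrative laser-plane triangulation (hypothetical geometry).

    p_px           -- radial pixel offset of the detected laser ring
    f_px           -- camera focal length in pixels
    baseline_m     -- offset of the laser source from the camera center (m)
    cone_angle_rad -- angle of the laser sheet from the camera's optical axis
    """
    # Viewing angle of the ring pixel from the optical axis.
    alpha = math.atan2(p_px, f_px)
    ta, tb = math.tan(alpha), math.tan(cone_angle_rad)
    if tb <= ta:
        raise ValueError("viewing ray does not intersect the laser sheet")
    # Intersect viewing ray x = z*tan(alpha) with the laser line
    # x = (z - baseline)*tan(cone_angle) in the cross-section plane.
    z = baseline_m * tb / (tb - ta)
    x = z * ta
    # Euclidean range from the camera center to the illuminated point.
    return math.hypot(x, z)
```

Sweeping this computation over every detected ring pixel around the image yields one omni-directional depth slice per frame; in a real sensor the focal length, baseline, and cone angle would come from calibration rather than being fixed constants.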
