Memory-based self-localization using omnidirectional images

This paper proposes a new self-localization method using an omnidirectional image sensor, which observes the surrounding environment with a 360-degree field of view. The method extracts information that is specific to the sensor's position yet invariant to its rotation by generating an autocorrelation image from an observed omnidirectional image. The sensor's location is estimated by evaluating the similarity between the autocorrelation image of the observed image and stored autocorrelation images. This similarity is evaluated in low-dimensional eigenspaces generated from the stored autocorrelation images. We conducted experiments with real images to examine the performance of the proposed method; the results show that it estimates the sensor's position accurately and robustly.
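As a rough illustration of the pipeline described above, the sketch below computes a rotation-invariant autocorrelation along the angular axis, builds an eigenspace from stored images via PCA, and localizes by nearest-neighbor matching in that space. It assumes the omnidirectional image has already been unwrapped into a panoramic array whose horizontal axis spans 360 degrees; all function names, the basis size k, and the FFT-based autocorrelation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def autocorrelation_image(panorama: np.ndarray) -> np.ndarray:
    """Circular autocorrelation along the angular (horizontal) axis.

    A rotation of the sensor circularly shifts the panorama along this
    axis, and circular autocorrelation is invariant to such shifts.
    """
    spectrum = np.fft.fft(panorama, axis=1)
    power = spectrum * np.conj(spectrum)          # power spectrum per row
    return np.real(np.fft.ifft(power, axis=1))    # Wiener-Khinchin theorem

def build_eigenspace(stored_acs: np.ndarray, k: int):
    """PCA over stored autocorrelation images (rows = flattened images)."""
    mean = stored_acs.mean(axis=0)
    centered = stored_acs - mean
    # SVD of the centered data yields the top-k eigenvectors
    # of the covariance matrix (the eigenspace basis).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]                                # shape (k, dim)
    return mean, basis, centered @ basis.T        # stored projections

def localize(panorama, mean, basis, stored_proj, positions):
    """Project an observed image and return the nearest stored position."""
    ac = autocorrelation_image(panorama).ravel()
    proj = (ac - mean) @ basis.T
    nearest = np.argmin(np.linalg.norm(stored_proj - proj, axis=1))
    return positions[nearest]
```

In use, stored_acs would hold the flattened autocorrelation images of panoramas captured at known positions; localize then returns the stored position whose eigenspace projection lies closest to that of the observed image.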
