Real-Time Biologically-Inspired Depth Maps from Spherical Flow

We present a strategy for generating real-time relative depth maps of an environment from optical flow under general motion. We achieve this using an insect-inspired hemispherical fish-eye sensor with a 190-degree field of view and a de-rotated optical flow field. The de-rotation algorithm is based on the theoretical work of Nelson and Aloimonos (1988). From the de-rotated field we obtain the translational component of motion and construct full relative depth maps on the sphere. We examine the robustness of this strategy in both simulation and real-world experiments, across a variety of environmental scenarios. To our knowledge, this is the first demonstrated implementation of the Nelson and Aloimonos algorithm running in real time over real image sequences. In addition, we apply this algorithm to the real-time recovery of full relative depth maps. These preliminary results demonstrate the feasibility of the approach for closed-loop control of a robot.
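The core geometric idea — once rotation has been removed from the flow field, relative depth falls out of the translational flow on the sphere — can be sketched as follows. For pure translation T, the flow at a unit viewing direction d is v(d) = (T − (T·d)d) / Z(d), where Z(d) is the range along d, so relative depth is recovered as |T − (T·d)d| / |v(d)|. This is a minimal illustrative sketch, not the authors' implementation; the function name and array layout are assumptions.

```python
import numpy as np

def relative_depth(dirs, flows, t):
    """Hypothetical sketch of depth-from-de-rotated-spherical-flow.

    dirs:  (N, 3) unit viewing directions on the sphere
    flows: (N, 3) de-rotated (purely translational) flow vectors,
           tangent to the sphere at each direction
    t:     (3,) translation direction (scale of T is unobservable,
           so the returned depths are relative, not metric)
    """
    t = t / np.linalg.norm(t)
    # Translational flow "template": T - (T.d) d, whose magnitude is
    # sin(angle between T and d) for unit T.
    template = t - (dirs @ t)[:, None] * dirs
    num = np.linalg.norm(template, axis=1)
    den = np.linalg.norm(flows, axis=1)
    # Guard against zero flow near the focus of expansion.
    return num / np.maximum(den, 1e-12)

# Toy self-check: synthesize flows from known depths, then invert them.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(100, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
t = np.array([0.0, 0.0, 1.0])
Z = rng.uniform(1.0, 5.0, size=100)
flows = (t - (dirs @ t)[:, None] * dirs) / Z[:, None]
Z_hat = relative_depth(dirs, flows, t)
print(np.allclose(Z_hat, Z))  # prints True
```

Note that depth is only recoverable up to the unknown translation speed, and the estimate degenerates near the focus of expansion, where the translational flow vanishes — one motivation for the wide spherical field of view used in the paper.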

[1] Tomaso A. Poggio, et al. Motion Field and Optical Flow: Qualitative Properties, 1989, IEEE Trans. Pattern Anal. Mach. Intell.

[2] Giulio Sandini, et al. Uncalibrated obstacle detection using normal flow, 1996.

[3] Giulio Sandini, et al. Visual Behaviors for Docking, 1997, Comput. Vis. Image Underst.

[4] Yiannis Aloimonos, et al. Directions of Motion Fields are Hardly Ever Ambiguous, 2004, International Journal of Computer Vision.

[5] Martin Herman, et al. Real-time single-workstation obstacle avoidance using only wide-field flow divergence, 1996, Proceedings of 13th International Conference on Pattern Recognition.

[6] M. V. Srinivasan, et al. How insects infer range from visual motion, 1993, Reviews of Oculomotor Research.

[7] Thomas S. Huang, et al. Theory of Reconstruction from Image Motion, 1992.

[8] Takeo Kanade, et al. An Iterative Image Registration Technique with an Application to Stereo Vision, 1981, IJCAI.

[9] Eero P. Simoncelli. Design of multi-dimensional derivative filters, 1994, Proceedings of 1st International Conference on Image Processing.

[10] J. Aloimonos, et al. Finding motion parameters from spherical motion fields (or the advantages of having eyes in the back of your head), 1988, Biological Cybernetics.

[11] Nick Barnes, et al. A Robust Docking Strategy for a Mobile Robot Using Flow Field Divergence, 2006, IEEE Transactions on Robotics.

[12] Gilad Adiv, et al. Inherent Ambiguities in Recovering 3-D Motion and Structure from a Noisy Flow Field, 1989, IEEE Trans. Pattern Anal. Mach. Intell.

[13] Giulio Sandini, et al. Uncalibrated obstacle detection using normal flow, 2005, Machine Vision and Applications.

[14] Nick Barnes, et al. Performance of optical flow techniques for indoor navigation with a mobile robot, 2004, Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '04).

[15] Giulio Sandini, et al. Divergent stereo in autonomous navigation: From bees to robots, 1995, International Journal of Computer Vision.

[16] F. A. Miles, et al. Visual Motion and Its Role in the Stabilization of Gaze, 1992.

[17] J. H. Rieger, et al. Processing differential image motion, 1985, Journal of the Optical Society of America A: Optics and Image Science.

[18] M. Srinivasan, et al. Visual motor computations in insects, 2004, Annual Review of Neuroscience.

[19] Yiannis Aloimonos, et al. Geometry of Eye Design: Biology and Technology, 2000, Theoretical Foundations of Computer Vision.

[20] H. C. Longuet-Higgins, et al. The interpretation of a moving retinal image, 1980, Proceedings of the Royal Society of London, Series B: Biological Sciences.

[21] Hans-Hellmut Nagel, et al. The coupling of rotation and translation in motion estimation of planar surfaces, 1993, Proceedings of IEEE Conference on Computer Vision and Pattern Recognition.

[22] M. Srinivasan, et al. Range estimation with a panoramic visual sensor, 1997.