Real-time obstacle avoidance using central flow divergence and peripheral flow

The lure of using motion vision as a fundamental element in the perception of space drives this effort to use flow features as the sole cues for robot mobility. Real-time estimates of image flow and flow divergence provide the robot's sense of space. The robot steers down a conceptual corridor by comparing left and right peripheral flows, and large central flow divergence warns it of an impending collision at a "dead end," at which point it turns around and resumes wandering. Behavior is generated by using flow-based information from the 2D image sequence directly; no 3D reconstruction is attempted. Active mechanical gaze stabilization simplifies the visual interpretation problem by reducing camera rotation. By combining corridor following and dead-end deflection, the robot has wandered around the lab at 30 cm/s for as long as 20 minutes without collision. The ability to support this behavior in real time with current equipment promises expanded capabilities as computational power increases.
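As a concrete illustration of the control loop the abstract describes, the following Python sketch balances left and right peripheral flow magnitudes for corridor following and triggers a turn-around when central flow divergence crosses a threshold. This is a minimal sketch under stated assumptions, not the authors' implementation; the function name, flow inputs, gain, and threshold values are all hypothetical.

    def wander_step(flow_left, flow_right, central_divergence,
                    k_steer=0.5, divergence_threshold=0.8):
        """Return (turn_rate, turn_around) for one control cycle.

        flow_left, flow_right  -- mean optical-flow magnitudes in the left and
            right peripheral image regions (larger flow = nearer surface).
        central_divergence     -- flow-field divergence in the central region;
            it grows as time-to-contact with a frontal surface shrinks.
        Positive turn_rate steers toward the side with weaker flow (the more
        distant wall).
        """
        if central_divergence > divergence_threshold:
            # Impending collision at a "dead end": signal a turn-around.
            return 0.0, True
        # Corridor following: balance the peripheral flows by turning away
        # from the side with larger flow.
        return k_steer * (flow_right - flow_left), False

    # Example: stronger flow on the right (a nearer right wall) gives a
    # positive turn rate, steering the robot left, away from that wall.
    turn, reverse = wander_step(flow_left=0.2, flow_right=0.6,
                                central_divergence=0.1)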
