Velocity and disparity cues for robust real-time binocular tracking

We have designed and implemented a real-time binocular tracking system that uses two independent cues, both central to biological visual systems, to robustly track moving targets in complex environments without a priori knowledge of target shape or texture. A fast optical flow segmentation algorithm locates independently moving objects for target acquisition and provides a reliable velocity estimate for smooth tracking. In parallel, target position is derived from the output of a zero-disparity filter, in which a phase-based disparity estimation technique allows dynamic control of camera vergence to adapt the horopter geometry to the target location. The system exploits the optical properties of our custom-designed foveated wide-angle lenses, which combine a wide field of view with a high-resolution fovea. We present methods to cope with the distortions introduced by the space-variant resolution, together with a robust real-time implementation on a high-performance active vision head.
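
The abstract does not spell out the filtering details. As a minimal one-dimensional sketch, the phase-based disparity idea can be illustrated as follows: both image rows are convolved with a complex Gabor filter, and disparity is approximated by the wrapped left/right phase difference divided by the filter's centre frequency. The function names, the parameters omega, sigma and min_mag, and the simple magnitude-based reliability test below are illustrative assumptions, not the authors' implementation; a zero-disparity filter would then retain only pixels whose estimated disparity is close to zero, i.e. near the current horopter.

```python
import numpy as np

def gabor_phase(row, omega, sigma):
    """Complex Gabor response of a 1-D image row; returns phase and magnitude."""
    n = int(3 * sigma)
    x = np.arange(-n, n + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2)) * np.exp(1j * omega * x)
    response = np.convolve(row, kernel, mode='same')
    return np.angle(response), np.abs(response)

def phase_disparity(left_row, right_row, omega=0.25, sigma=8.0, min_mag=1e-3):
    """Per-pixel disparity sketch from the left/right Gabor phase difference.

    Disparity is approximated as the wrapped phase difference divided by the
    filter's centre frequency; pixels with weak filter magnitude in either
    image are masked out as unreliable (illustrative threshold).
    """
    phi_l, mag_l = gabor_phase(left_row, omega, sigma)
    phi_r, mag_r = gabor_phase(right_row, omega, sigma)
    dphi = np.angle(np.exp(1j * (phi_l - phi_r)))   # wrap difference to [-pi, pi)
    disparity = dphi / omega
    valid = (mag_l > min_mag) & (mag_r > min_mag)
    return np.where(valid, disparity, np.nan)
```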
