Frontally placed eyes versus laterally placed eyes: computational comparison of their functions for ego-motion estimation.

Both frontally placed and laterally placed eyes are common in nature, and although which configuration is better may be one of the most intuitive questions to ask, it may also be one of the hardest to answer. Their most obvious difference is that, at least as commonly assumed in the computer vision community, stereopsis plays the central role in a visual system composed of frontally placed eyes (or cameras), whereas it is unavailable in the lateral configuration owing to the lack of overlap between the visual fields. As a result, researchers have adopted entirely different approaches to model the two configurations and have developed computational mimics of them to address various vision problems. Recently, the advent of the quasi-parallax concept has unified the ego-motion estimation procedures of the two eye configurations into a single framework, making systematic comparison feasible. In this paper, we investigate the computational merits of the two eye topographies from the perspective of ego-motion estimation. Specifically, quasi-parallax is applied to fuse motion cues from individual cameras at an early stage, at the pixel level, and to recover translation and rotation separately with high accuracy and efficiency, without the need for feature matching. Furthermore, its applicability to extended sideways arrangements is studied, making our comparison more general and insightful. Extensive experiments on both synthetic and real data verify the computational superiority of the lateral configuration.
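The separation of translation from rotation that quasi-parallax exploits rests on the standard instantaneous motion-field (optical-flow) model: the rotational flow component is depth-independent, while the translational component scales with inverse depth, so combinations of flows in which the rotational terms cancel expose the translational (depth-dependent) structure directly. A minimal sketch of that underlying model, assuming a calibrated pinhole camera; the function name and all numerical values are illustrative and not taken from the paper:

```python
def motion_field(x, y, Z, t, w, f=1.0):
    """Instantaneous image motion (optical flow) at pixel (x, y) of a point
    at depth Z, seen by a calibrated pinhole camera with focal length f,
    under translation t = (tx, ty, tz) and rotation w = (wx, wy, wz).
    Standard rigid-motion flow model; values used below are illustrative."""
    tx, ty, tz = t
    wx, wy, wz = w
    # Translational component: scales with inverse depth 1/Z.
    u_t = (-f * tx + x * tz) / Z
    v_t = (-f * ty + y * tz) / Z
    # Rotational component: independent of depth Z.
    u_r = (x * y / f) * wx - (f + x * x / f) * wy + y * wz
    v_r = (f + y * y / f) * wx - (x * y / f) * wy - x * wz
    return u_t + u_r, v_t + v_r

# Same pixel, same rigid motion, two different depths: differencing the two
# flows cancels the rotational contribution entirely, leaving a purely
# depth-dependent (translational) signal -- the cancellation principle that
# parallax-style constructions rely on.
u1, v1 = motion_field(0.1, -0.2, 2.0, (0.3, -0.1, 0.5), (0.02, -0.01, 0.03))
u2, v2 = motion_field(0.1, -0.2, 5.0, (0.3, -0.1, 0.5), (0.02, -0.01, 0.03))
du, dv = u1 - u2, v1 - v2  # rotation-free residual
```

Quasi-parallax generalizes this cancellation to pairs of flows drawn from different cameras of a rigidly coupled rig, which share the same rotation, rather than from two depths at one pixel.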
