Qualitative spatial reasoning to infer the camera position in generic object recognition

Qualitative spatial reasoning and the qualitative representation of space are required in many computer vision applications. We present a new approach that uses fuzzy spatial relations to qualitatively estimate the current camera position in an active object recognition experiment. Starting from a single image of an object and a corresponding mapping to a 3D CAD prototype, the visibility and occlusion of parts of the prototype are used to infer possible viewing directions on a view sphere, i.e., an initial object pose estimate. This representation can serve several tasks, e.g., refining the current viewpoint estimate, planning a new view, and verifying the current object hypothesis. Experiments demonstrate initial viewpoint estimation for simple CAD prototypes. The method is applicable to generic object recognition and to other areas of qualitative vision.
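To make the viewpoint-inference step concrete, the following is a minimal sketch of the idea, not the authors' implementation: it assumes a Fibonacci tessellation of the view sphere, hypothetical part names with outward normals, and a simple linear fuzzy membership for "part is visible"; the paper's fuzzy spatial relations are richer than this.

```python
import numpy as np

def fibonacci_sphere(n):
    """Tessellate the view sphere into n roughly uniform viewing directions."""
    i = np.arange(n)
    phi = np.arccos(1.0 - 2.0 * (i + 0.5) / n)        # polar angle
    theta = np.pi * (1.0 + 5.0 ** 0.5) * i            # golden-angle azimuth
    return np.stack([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)], axis=1)

def viewpoint_scores(viewpoints, part_normals, observed):
    """
    Fuzzy compatibility of each candidate viewpoint with the observed
    visibility/occlusion of prototype parts. A part counts as visible
    when its outward normal faces the camera; the fuzzy membership
    ramps smoothly near grazing angles (an assumed membership shape).
    """
    scores = np.ones(len(viewpoints))
    for part, visible in observed.items():
        cosang = viewpoints @ part_normals[part]       # alignment with normal
        mu = np.clip((cosang + 0.2) / 0.4, 0.0, 1.0)   # fuzzy "visible" degree
        scores *= mu if visible else (1.0 - mu)        # fuzzy AND over parts
    return scores

# Hypothetical prototype: a box-like CAD part set with outward normals.
part_normals = {"front": np.array([0.0, 0.0, 1.0]),
                "top":   np.array([0.0, 1.0, 0.0]),
                "right": np.array([1.0, 0.0, 0.0])}
views = fibonacci_sphere(500)
# Observation from the single image: "front" and "top" visible, "right" occluded.
scores = viewpoint_scores(views, part_normals,
                          {"front": True, "top": True, "right": False})
print("most compatible viewing direction:", views[np.argmax(scores)])
```

Here the product of memberships acts as a fuzzy conjunction: a viewing direction scores highly only if it is compatible with every observed visibility and occlusion, yielding a graded region on the view sphere rather than a single pose.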
