Egomotion estimation of a multicamera system through line correspondences

In this paper we propose a method for estimating the egomotion of a calibrated multi-camera system from an analysis of luminance edges. The method works entirely in 3D space: the edges in each set of views are first localized, matched, and back-projected into object space. The method then searches for the rigid motion that best merges the sets of 3D contours extracted from each multi-view set. Both straight and curved 3D contours are used.
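The core step of merging the 3D contour sets amounts to estimating a rigid transform (R, t) that aligns points sampled from one contour set with their counterparts in another. As a minimal illustrative sketch (not the paper's actual optimization, which operates on contours and may use a different criterion), the standard least-squares rigid fit between two corresponded 3D point sets can be computed with the Kabsch/SVD method; all names below are hypothetical:

```python
import numpy as np

def kabsch_rigid_transform(P, Q):
    """Estimate R, t such that R @ p_i + t ~= q_i for corresponded Nx3
    point sets P, Q (least-squares rigid fit; illustrative only)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)   # centroids
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Usage: synthetic points standing in for samples on 3D contours
rng = np.random.default_rng(0)
P = rng.standard_normal((50, 3))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([0.5, -1.0, 2.0])
Q = P @ R_true.T + t_true                     # rigidly moved copy of P
R_est, t_est = kabsch_rigid_transform(P, Q)
```

With noiseless correspondences, as here, the estimated (R_est, t_est) recovers the true motion to numerical precision; in the egomotion setting such a fit would be applied to contour samples from successive multi-view sets.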