Real-time dense stereo mapping for multi-sensor navigation

Reliable estimation of a system's position and orientation in the 3D world is normally the first and absolutely necessary requirement for the functional and operational capability of any unpiloted moving system in real-time applications. A multi-sensor approach to this task, combined with an optimal fusion of the measurements, provides the best parameter estimation accuracy. Depending on the application, the core inertial sensors are combined with one or more additional sensors such as optical sensors (stereo cameras), GPS and others. The main points of investigation are the development of methods and algorithms for complex state estimation and their implementation in a real-time software and hardware solution, which serves as a base navigation system for different applications, for example indoor navigation, driver assistance for vehicles and trains, the estimation of exterior orientation for airborne and spaceborne camera systems, and attitude control for small satellites. Based on these investigations, an integral positioning system (IPS) has been deployed and tested. The derivation of high-quality products from the optical stereo sensor data can strongly improve the resulting navigation information. In addition to feature-based stereo matching algorithms, which can be executed on a standard mobile computer in real time, we use a GPU implementation of the high-quality Semi-Global Matching (SGM) algorithm. It competes in quality with the currently best global stereo methods while being much more efficient. A precondition for a real-time SGM approach is a special epipolar geometry of the input images: all epipolar lines must be parallel to the x-axis. Therefore, the interior and exterior orientation parameters of the stereo camera system have to be determined accurately. We have fully implemented the image rectification step and the SGM algorithm with several cost functions in OpenGL/Cg. This processing unit forms a real-time 3D system called E3D, which can be combined with the IPS sensor head. The combination of the two systems, IPS and E3D, improves the quality of the data products: disparity data integrity can be checked by checking the orientation parameters of the stereo cameras; 3D points can be referenced in time and space; feature-based matching can be improved and sped up using a priori knowledge from the dense disparity map; and 3D points at infinity can be used to determine the rotational part of the ego-motion. Such a multi-sensor system is an ideal platform for SLAM applications.
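To make the SGM step concrete, the following is a minimal NumPy sketch of the core cost aggregation recurrence along a single scanline path on rectified images. The function name, the penalty values and the assumption of a precomputed per-pixel matching cost volume are illustrative only; they do not reflect the paper's actual OpenGL/Cg GPU implementation.

```python
import numpy as np

def sgm_aggregate_left_to_right(cost, P1=10, P2=120):
    """Aggregate a matching-cost volume along one path (left to right).

    cost: float array of shape (H, W, D) -- per-pixel matching cost for
          each disparity d (e.g. from a census or mutual-information cost).
    P1:   small penalty for disparity changes of +/- 1 pixel.
    P2:   larger penalty for disparity jumps greater than 1 pixel.
    Returns the aggregated cost L_r with the same shape as cost.
    """
    H, W, D = cost.shape
    L = np.empty_like(cost)
    L[:, 0, :] = cost[:, 0, :]                      # path starts at column 0
    for x in range(1, W):
        prev = L[:, x - 1, :]                        # (H, D) costs at previous pixel
        prev_min = prev.min(axis=1, keepdims=True)   # minimum over all disparities
        # candidate transitions: same d, d +/- 1 (penalty P1), any d (penalty P2)
        same = prev
        up = np.full_like(prev, np.inf)
        up[:, 1:] = prev[:, :-1] + P1
        down = np.full_like(prev, np.inf)
        down[:, :-1] = prev[:, 1:] + P1
        jump = prev_min + P2
        best = np.minimum(np.minimum(same, up), np.minimum(down, jump))
        # subtract prev_min to keep the values bounded (standard SGM normalisation)
        L[:, x, :] = cost[:, x, :] + best - prev_min
    return L
```

A full SGM run sums such aggregated costs over typically eight path directions and selects the winner-take-all disparity per pixel; in the system described above, these per-path recurrences are mapped to OpenGL/Cg fragment shaders to reach real-time rates.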