FUSION OF OPTICAL AND RADAR REMOTE SENSING DATA: MUNICH CITY EXAMPLE

Fusion of optical and radar remote sensing data has recently become an active topic in various application areas, though the results are not always satisfactory. In this paper we analyze some disturbing effects in the fusion of orthoimages from sensors with different acquisition geometries. These effects are errors in the DEM used for image orthorectification and the presence of 3D objects in the scene. We analyze how these effects produce ground displacements in orthoimages derived from optical and radar data. Further, we propose a sensor formation with acquisition geometry parameters that makes it possible to minimize or compensate for the ground displacements in the different orthoimages caused by the above-mentioned effects, and thus to establish good prerequisites for subsequent fusion in specific application areas, e.g. matching, filling data gaps, and classification. To demonstrate the potential of the proposed approach, two pairs of optical and radar data were acquired over an urban area: the city of Munich, Germany. The first collection, of WorldView-1 and TerraSAR-X data, followed the proposed recommendations for the acquisition geometry parameters, whereas the second collection, of IKONOS and TerraSAR-X data, was acquired with arbitrary parameters. The experiment fully confirmed our ideas. Moreover, it opens new possibilities for optical and radar image fusion.
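
For intuition, the geometric argument behind the proposed acquisition condition can be sketched with the standard relief-displacement relations; the symbols $\Delta h$, $d_{\mathrm{opt}}$, $d_{\mathrm{SAR}}$, $\theta_{\mathrm{opt}}$ and $\theta_{\mathrm{SAR}}$ are introduced here for illustration and are not taken from the text above. For a DEM height error $\Delta h$, the horizontal ground displacements in the optical and radar orthoimages are

$$ d_{\mathrm{opt}} = \Delta h \,\tan\theta_{\mathrm{opt}}, \qquad d_{\mathrm{SAR}} = \Delta h \,\cot\theta_{\mathrm{SAR}}, $$

where $\theta_{\mathrm{opt}}$ is the optical off-nadir viewing angle and $\theta_{\mathrm{SAR}}$ is the radar incidence angle. With same-side acquisition, both displacements point in the same direction and coincide in magnitude when

$$ \tan\theta_{\mathrm{opt}} \,\tan\theta_{\mathrm{SAR}} = 1 \quad\Longleftrightarrow\quad \theta_{\mathrm{opt}} + \theta_{\mathrm{SAR}} = 90^{\circ}, $$

so that both orthoimages shift by the same amount and the relative displacement between them cancels. This is a textbook-geometry sketch of why a suitably chosen formation can compensate DEM-induced displacements, not a derivation quoted from the paper itself.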