Deriving Classification Rules from Multiple Remotely Sensed Data with Data Mining

In urban planning, the identification and formalization of urban elements must be supported, and this often requires combining complementary information from a set of images together with ancillary data. However, methods for combining several such sources are still lacking. In general, using several sources of remotely sensed data in a classification procedure relies either on data fusion upstream or on fusion of the classification results. Since the advent of very high resolution (VHR) images, object-oriented methods have been developed for image analysis. This approach segments images into homogeneous regions and characterizes the resulting objects with features describing their spectral signatures and their spatial and contextual properties. The main issue with this approach is the definition of the classification knowledge base: the relevant information is generally not well formalized, and it is difficult to capture knowledge directly from domain experts, who are rarely able to provide an explicit description of the knowledge they use to identify objects. In this paper, we propose to use data mining techniques to automatically derive a set of classification rules from remotely sensed data. Knowledge is extracted from a VHR image (Quickbird MS) following an object-oriented approach. We also investigate the possibility of acquiring matching rules from multiple classified images. These rules can help to improve classification accuracy and can also be used to build a multi-scale database. Experiments show the effectiveness of the proposed approach; our first results indicate that the performance of the learnt rules is acceptably good.
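
To illustrate the rule-derivation step described above, the sketch below induces explicit if-then classification rules from a small table of hypothetical per-object features (mean band reflectances, area, compactness) using a decision tree. The feature names, the three land-cover classes, the synthetic data, and the use of scikit-learn are assumptions made for illustration only; they are not the classifier, features, or data used in the paper.

```python
# Minimal sketch (assumed setup, not the paper's actual pipeline):
# derive readable classification rules from per-object features with a decision tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical object-level features computed after segmentation:
# mean reflectance in 4 multispectral bands, object area (m^2), compactness.
feature_names = ["mean_b1", "mean_b2", "mean_b3", "mean_nir", "area", "compactness"]

def make_objects(n, nir_level, area_level, label):
    """Generate synthetic segmented objects for one illustrative land-cover class."""
    bands = rng.normal(loc=[0.10, 0.12, 0.11, nir_level], scale=0.02, size=(n, 4))
    area = rng.normal(loc=area_level, scale=area_level * 0.2, size=(n, 1))
    compactness = rng.uniform(0.3, 0.9, size=(n, 1))
    X = np.hstack([bands, area, compactness])
    y = np.full(n, label)
    return X, y

# Three illustrative classes: buildings (low NIR, small objects),
# vegetation (high NIR), roads (low NIR, large objects).
Xb, yb = make_objects(100, nir_level=0.15, area_level=200.0, label="building")
Xv, yv = make_objects(100, nir_level=0.55, area_level=500.0, label="vegetation")
Xr, yr = make_objects(100, nir_level=0.14, area_level=2000.0, label="road")

X = np.vstack([Xb, Xv, Xr])
y = np.concatenate([yb, yv, yr])

# Induce a shallow tree so the learnt rules stay readable for a domain expert.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Export the decision paths as human-readable if-then rules.
print(export_text(tree, feature_names=feature_names))
```

In the setting of the paper, the feature table would instead be obtained by segmenting the Quickbird MS image into homogeneous regions and computing spectral, spatial, and contextual attributes per region; the rule-induction step then yields the classification knowledge base that is otherwise hard to elicit from domain experts.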