Earth observation satellites provide data covering different parts of the electromagnetic spectrum at different spatial, spectral, and temporal resolutions. To utilize these different types of image data effectively, a number of image fusion techniques have been developed. Image fusion is defined as "the set of methods, tools, and means of using data from two or more different images to improve the quality of the information" (1). The fused image carries richer information, which improves the performance of image analysis algorithms. This increase in information quality leads to better processing accuracies (e.g., classification, segmentation) than using one type of data alone. In this paper, pixel-level and feature-level image fusion are applied to the classification of co-registered QuickBird multispectral and panchromatic images.

I. INTRODUCTION

A wide spectrum of remotely sensed data, such as multispectral imagery, radar imagery, hyperspectral imagery, geographic information system (GIS) map data, and light detection and ranging (LIDAR) data, is now available. For many image analysis applications, the information provided by a single imagery type or source is incomplete or insufficient. Additional sources may provide complementary information that helps to better characterize the observed land cover. Image fusion is used extensively to combine complementary information from different sensors and thereby provide a better understanding of the observed earth surface.

Image fusion takes place at three different levels: pixel, feature, and decision (2). In pixel-level fusion, a new image is formed whose pixel values are obtained by combining the pixel values of different images through some algorithm; the new image is then used for further processing such as feature extraction and classification. In feature-level fusion, features are extracted separately from different types of images of the same geographic area.
The extracted features are then classified using statistical or other types of classifiers. In decision-level fusion, the images are processed separately; the processed information is then refined by combining the information obtained from the different sources, and differences are resolved based on certain decision rules. Figure 1 provides a visual interpretation of the different levels of fusion.

In this paper, pixel-level and feature-level fusion were used to classify a QuickBird multispectral image. A QuickBird panchromatic image was used to extract the complementary spatial information. The classification results are compared with those of the original multispectral image.

II. IMAGE FUSION

A. Pixel-level fusion

Pansharpening is a pixel-level fusion technique used to increase the spatial resolution of a multispectral image while simultaneously preserving its spectral information. Pansharpening is also known as resolution merge, image integration, and multisensor data fusion. Its applications include improving geometric correction, enhancing features not visible in either data set alone, change detection using temporal data sets, and enhancing classification.

Different pansharpening algorithms are discussed in the literature. Pohl et al. (2) provided a detailed review of the methods used for pansharpening and of the need to assess the quality of the fused image. Intensity-Hue-Saturation (IHS) transform based sharpening, principal component analysis (PCA) based sharpening, Brovey sharpening, regression model based sharpening, and wavelet transform based sharpening are some of the widely used techniques. The IHS and Brovey techniques provide good spatial quality but poor spectral quality. PCA based sharpening performs better than IHS and Brovey sharpening.
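The component-substitution idea behind IHS-type sharpening can be sketched with a minimal numpy-only example. This is an illustrative "fast IHS" variant, not the paper's implementation: the intensity component is approximated as the band mean, and substituting the panchromatic band for intensity reduces to adding the detail (pan − I) to every band. The function name `ihs_pansharpen` is hypothetical.

```python
import numpy as np

def ihs_pansharpen(ms, pan):
    """Fast IHS-style pansharpening sketch.

    ms  : (bands, H, W) multispectral image, already resampled
          to the panchromatic grid.
    pan : (H, W) panchromatic image.

    The intensity component I is approximated by the band mean;
    injecting (pan - I) into every band is algebraically equivalent
    to substituting pan for I in IHS space.
    """
    intensity = ms.mean(axis=0)            # crude I component
    detail = pan - intensity               # high-resolution spatial detail
    return ms + detail[np.newaxis, :, :]   # same detail added to each band
```

A useful sanity property follows directly: if the panchromatic band happens to equal the intensity component, the output is identical to the input, so all spectral distortion comes from the mismatch between pan and I — which is exactly the source of the poor spectral quality noted above for IHS and Brovey sharpening.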
However, the performance varies with the data used. Different wavelet based techniques are available in the literature; they differ in the type of wavelet transform, the mother wavelet, and the rule used for combining the multispectral and panchromatic data. Nunez et al. (3) used the 'à trous' wavelet transform to fuse multispectral and panchromatic images. The IHS transform was used to preprocess the multispectral data, and the intensity band was used in the fusion process; the detail coefficients of the panchromatic image were either added to the multispectral image or substituted for some of its high-frequency details. King and Wang (4) used the dyadic discrete wavelet transform (DWT) with the biorthogonal 9/7 mother wavelet; the detail coefficients of the panchromatic image are added to the intensity component to enhance the spatial resolution. Other wavelet based techniques use the redundant discrete wavelet transform (RDWT), which reduces some of the artifacts produced by DWT schemes. Most of these techniques are available in commercial remote sensing software packages such as ERDAS Imagine®, ENVI®, and PCI Geomatica®.
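The additive 'à trous' scheme described above can be sketched in a few lines of numpy. This is a simplified illustration of the general idea (undecimated smoothing with a B3 spline kernel whose taps are spread apart at each level; the wavelet planes of the panchromatic image are added to each multispectral band), not the exact algorithm of (3), which works on the IHS intensity component. The function names are hypothetical.

```python
import numpy as np

B3 = np.array([1, 4, 6, 4, 1]) / 16.0  # B3 cubic-spline kernel

def atrous_smooth(img, level=0):
    """One separable 'à trous' (with holes) smoothing pass.

    At level j the kernel taps are spaced 2**j apart, giving an
    undecimated, shift-invariant decomposition.
    """
    step = 2 ** level
    kernel = np.zeros(4 * step + 1)
    kernel[::step] = B3
    rows = np.apply_along_axis(np.convolve, 1, img, kernel, mode='same')
    return np.apply_along_axis(np.convolve, 0, rows, kernel, mode='same')

def atrous_fuse(ms, pan, levels=2):
    """Additive wavelet fusion sketch: accumulate the wavelet planes
    (detail at each level) of the panchromatic image and add them to
    every multispectral band."""
    approx = pan.astype(float)
    detail = np.zeros_like(approx)
    for j in range(levels):
        smooth = atrous_smooth(approx, level=j)
        detail += approx - smooth          # wavelet plane w_j
        approx = smooth
    return ms + detail[np.newaxis, :, :]
```

Because the sum of the wavelet planes plus the final approximation reconstructs the panchromatic image exactly, the scheme injects precisely the high-frequency content that the smoothing removed, and the same detail is added to every band so band ratios are perturbed less than with direct substitution.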
[1] T. Leen et al., "Probabilistic model-based multisensor image fusion," 1999.
[2] Y. Wang et al., "Using the discrete wavelet frame transform to merge Landsat TM and SPOT panchromatic images," Inf. Fusion, 2002.
[3] C. Pohl et al., "Multisensor image fusion in remote sensing: concepts, methods and applications," 1998.
[4] J. Zhou et al., "A wavelet transform method to merge Landsat TM and SPOT panchromatic data," 1998.
[5] F. Dell'Acqua et al., "Extraction and fusion of street networks from fine resolution SAR data," Proc. IEEE Int. Geoscience and Remote Sensing Symposium (IGARSS), 2002.
[6] R. L. King et al., "A wavelet based algorithm for pan sharpening Landsat 7 imagery," Proc. IEEE Int. Geoscience and Remote Sensing Symposium (IGARSS), 2001.
[7] R. L. King et al., "Estimation of the number of decomposition levels for a wavelet-based multiresolution multisensor image fusion," IEEE Trans. Geosci. Remote Sens., 2006.
[8] R. A. Schowengerdt et al., "IKONOS spatial resolution and image interpretability characterization," 2003.
[9] J. A. Benediktsson et al., "Classification of multisource and hyperspectral data based on decision fusion," IEEE Trans. Geosci. Remote Sens., 1999.
[10] B. Aiazzi et al., "Multispectral fusion of multisensor image data by the generalized Laplacian pyramid," Proc. IEEE Int. Geoscience and Remote Sensing Symposium (IGARSS), 1999.
[11] S. Mallat et al., "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Trans. Pattern Anal. Mach. Intell., 1989.
[12] A. R. Smith et al., "Color gamut transform pairs," SIGGRAPH, 1978.
[13] B. S. Manjunath et al., "Multisensor image fusion using the wavelet transform," CVGIP: Graph. Models Image Process., 1995.
[14] C. N. Canagarajah et al., "Image fusion using complex wavelets," BMVC, 2002.
[15] J. E. Fowler et al., "The redundant discrete wavelet transform and additive noise," IEEE Signal Process. Lett., 2005.
[16] X. Otazu et al., "Multiresolution-based image fusion with additive wavelet decomposition," IEEE Trans. Geosci. Remote Sens., 1999.
[17] N. H. Younan et al., "Quantitative analysis of pansharpened images," 2006.
[18] W. Shi et al., "Multi-band wavelet for fusing SPOT panchromatic and multispectral images," 2003.
[19] L. Wald et al., "Some terms of reference in data fusion," IEEE Trans. Geosci. Remote Sens., 1999.
[20] G. P. Lemeshewsky, "Multispectral multisensor image fusion using wavelet transforms," Proc. SPIE Defense, Security, and Sensing, 1999.
[21] E. H. Adelson et al., "The Laplacian pyramid as a compact image code," IEEE Trans. Commun., 1983.
[22] R. García et al., "Fusion of multispectral and panchromatic images using improved IHS and PCA mergers based on wavelet decomposition," IEEE Trans. Geosci. Remote Sens., 2004.
[23] D. A. Landgrebe et al., "Decision fusion approach for multitemporal classification," IEEE Trans. Geosci. Remote Sens., 1999.
[24] T. Blu et al., "Using iterated rational filter banks within the ARSIS concept for producing 10 m Landsat multispectral images," 1998.