Image fusion is a technique for integrating a high-resolution panchromatic image with a multispectral image, so that the fused image contains both the spatial detail of the panchromatic image and the color information of the multispectral image. The wavelet transform, originally a mathematical tool for signal processing, is now widely used in image fusion, and many wavelet-based fusion methods have been published. The wavelets used in image fusion fall into three general classes: orthogonal, biorthogonal, and nonorthogonal. Although these wavelets share some common properties, each class leads to a different decomposition and reconstruction scheme, and these differences carry over into the corresponding fusion methods. This paper compares image fusion methods built on the three classes, using a representative wavelet from each class as the mathematical model for the fusion algorithm: Daubechies (orthogonal), biorthogonal spline (biorthogonal), and à trous (nonorthogonal). When the wavelet transform alone is used, the fusion result is often unsatisfactory; integrating the wavelet transform with the IHS transform can produce better results. Because the substitution in the IHS transform is limited to the intensity component, using the wavelet transform to modify the intensity and the IHS transform to fuse the image makes the fusion process simpler and faster, and it also preserves color information better. The fusion method based on this IHS and wavelet integration is employed in this paper. IKONOS image data are used to evaluate the three wavelet fusion methods, and the fusion results are compared graphically, visually, and statistically.
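To make the IHS and wavelet integration concrete, the sketch below outlines one possible implementation in Python using only NumPy and SciPy. It is a minimal sketch under stated assumptions, not the authors' implementation: the B3 cubic spline kernel, the two decomposition levels, the mean-of-bands intensity, the mean/standard-deviation matching of the panchromatic band, and the function names atrous_decompose and ihs_wavelet_fusion are illustrative choices rather than details taken from the paper.

import numpy as np
from scipy.ndimage import convolve

def atrous_decompose(img, levels=2):
    """Undecimated (a trous) wavelet decomposition.
    Returns the list of detail planes and the final approximation."""
    # B3 cubic spline scaling kernel, a common choice for the a trous algorithm (assumption)
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    kernel = np.outer(h, h)
    details, approx = [], img.astype(float)
    for j in range(levels):
        # Dilate the kernel by inserting zeros between taps at each scale
        step = 2 ** j
        k = np.zeros((4 * step + 1, 4 * step + 1))
        k[::step, ::step] = kernel
        smoothed = convolve(approx, k, mode='mirror')
        details.append(approx - smoothed)   # wavelet (detail) plane at scale j
        approx = smoothed
    return details, approx

def ihs_wavelet_fusion(ms, pan, levels=2):
    """Fuse a 3-band multispectral image (H, W, 3) with a panchromatic band (H, W),
    both assumed to be resampled to the same grid."""
    ms = ms.astype(float)
    pan = pan.astype(float)
    # Linear IHS: intensity taken as the mean of the three bands (assumption)
    intensity = ms.mean(axis=2)
    # Match the panchromatic band to the intensity with simple mean/std adjustment
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12) * intensity.std() + intensity.mean()
    # Inject the high-frequency wavelet planes of the pan image into the intensity
    details, _ = atrous_decompose(pan_matched, levels)
    new_intensity = intensity + sum(details)
    # Inverse of the linear IHS substitution: add the intensity change to each band
    fused = ms + (new_intensity - intensity)[..., None]
    return fused

The à trous transform is used in this sketch because it is undecimated, so the detail planes stay the same size as the intensity image and can be added back directly without interpolation; an orthogonal or biorthogonal wavelet would instead require decimated subband substitution and reconstruction.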