MULTIRESOLUTION IMAGE FUSION: PHASE CONGRUENCY FOR SPATIAL CONSISTENCY ASSESSMENT

Multiresolution and multispectral image fusion (pan-sharpening) requires assessment of both spectral and spatial consistency. Many fusion methods that achieve perfect spectral consistency may lack spatial consistency, and vice versa; hence both must be assessed. To date, only a few approaches to spatial consistency assessment have been proposed, based on comparing edge maps computed with gradient-like operators (Sobel or Laplace). Since image fusion may alter the intensity and contrast of objects in the fused image, gradient methods may produce disagreeing edge maps for the fused and reference (panchromatic) images, which can lead to wrong conclusions about spatial consistency. In this paper we propose to use phase congruency for spatial consistency assessment. This measure is invariant to changes in intensity and contrast, and it allows the spatial consistency of a fused image to be assessed in a multiscale way. Several assessment tests on IKONOS data were used to compare known assessment measures against the phase congruency based measure. The results show that the phase congruency measure follows the same trend as other widely used assessment measures and yields a confident assessment of spatial consistency.
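The invariance property underlying the proposed measure can be illustrated with a minimal one-dimensional sketch. This is a simplification of Kovesi's phase congruency (log-Gabor filters over a few scales, no noise compensation or frequency-spread weighting), and all parameter values below are illustrative assumptions rather than the paper's actual settings:

```python
import numpy as np

def phase_congruency_1d(signal, nscales=4, min_wavelength=6.0, mult=2.0, sigma=0.55):
    """Simplified 1D phase congruency: |sum of complex filter responses|
    divided by the sum of their amplitudes, over several log-Gabor scales."""
    n = len(signal)
    F = np.fft.fft(signal)
    freqs = np.fft.fftfreq(n)
    sum_even = np.zeros(n)   # accumulated real (even-symmetric) responses
    sum_odd = np.zeros(n)    # accumulated imaginary (odd-symmetric) responses
    sum_amp = np.zeros(n)    # accumulated response amplitudes
    for s in range(nscales):
        f0 = 1.0 / (min_wavelength * mult**s)   # centre frequency of this scale
        # Log-Gabor transfer function; it is zero at DC, so constant
        # intensity offsets do not affect the responses.
        with np.errstate(divide='ignore', invalid='ignore'):
            logg = np.exp(-(np.log(np.abs(freqs) / f0))**2
                          / (2.0 * np.log(sigma)**2))
        logg[freqs <= 0] = 0.0                  # keep positive frequencies: analytic signal
        resp = np.fft.ifft(F * logg)            # complex (even + i*odd) response
        sum_even += resp.real
        sum_odd += resp.imag
        sum_amp += np.abs(resp)
    energy = np.sqrt(sum_even**2 + sum_odd**2)
    return energy / (sum_amp + 1e-8)            # small epsilon avoids division by zero

# A step edge: phase congruency peaks at the edge, and rescaling the
# contrast or shifting the intensity leaves the measure unchanged,
# whereas a gradient magnitude would scale with the contrast factor.
s = np.zeros(256)
s[128:] = 1.0
pc_orig = phase_congruency_1d(s)
pc_rescaled = phase_congruency_1d(0.3 * s + 10.0)
```

Because phase congruency is a ratio of linear filter responses, multiplying the signal by a constant cancels out, and the DC-free filters discard any additive offset; this is exactly why the edge maps of a fused and a panchromatic image remain comparable even when fusion changes local contrast.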
