An objective evaluation metric for color image fusion

Image fusion has been studied extensively over the past two decades. Image fusion algorithms combine several images from different sensors into a composite (fused) image. The performance of image fusion methods can be assessed using subjective and/or objective measures; however, subjective evaluation involves human subjects, which significantly increases the cost in time and resources. In this paper, we discuss the objective evaluation of color image fusion algorithms. Given a reference color image and fused color images, we first convert the images into CIELab color space. We then define four image metrics in CIELab space: the phase congruency metric (PCM), the image gradient magnitude metric (IGMM), the image contrast metric (ICM), and the color naturalness metric (CNM). Finally, from these four metrics, we propose an objective evaluation index (OEI) that measures the similarity of a fused image to the reference image: the larger the OEI value, the more similar the fused image is to the reference. To validate the proposed metric, fused images are first formed with different color fusion algorithms using a set of multispectral images (including visible color, near-infrared, and long-wave infrared images); the OEIs of the fused images are then calculated and compared. Experimental results show that the proposed objective evaluation index is very promising and agrees well with subjective evaluation.
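For concreteness, the sketch below illustrates the kind of pipeline the abstract outlines: convert the reference and fused images to CIELab, compute per-metric similarities, and combine them into a single score. The abstract does not give the formulas for PCM, IGMM, ICM, or CNM, so the function names, constants, and simplified similarity terms here (a gradient-magnitude term and a contrast term on the L* channel) are illustrative assumptions, not the authors' definitions.

```python
# Minimal sketch of an OEI-style evaluation pipeline (assumed, simplified formulas;
# not the paper's PCM/IGMM/ICM/CNM definitions).
import numpy as np
from skimage import color, filters


def lab_lightness(rgb):
    """Convert an RGB image (H x W x 3, values in [0, 1]) to the CIELab L* channel."""
    return color.rgb2lab(rgb)[..., 0]


def gradient_similarity(ref_l, fused_l, c=160.0):
    """IGMM-like term (assumed): pointwise similarity of Sobel gradient magnitudes."""
    g_ref = filters.sobel(ref_l)
    g_fus = filters.sobel(fused_l)
    return np.mean((2.0 * g_ref * g_fus + c) / (g_ref ** 2 + g_fus ** 2 + c))


def contrast_similarity(ref_l, fused_l, c=1e-3):
    """ICM-like term (assumed): similarity of global standard deviations."""
    s_ref, s_fus = ref_l.std(), fused_l.std()
    return (2.0 * s_ref * s_fus + c) / (s_ref ** 2 + s_fus ** 2 + c)


def oei_sketch(reference_rgb, fused_rgb):
    """Combine the similarity terms into one score; higher means closer to the reference."""
    ref_l = lab_lightness(reference_rgb)
    fus_l = lab_lightness(fused_rgb)
    return gradient_similarity(ref_l, fus_l) * contrast_similarity(ref_l, fus_l)
```

In this sketch a fused image from each algorithm would be scored against the same reference, and the algorithm whose output yields the highest score would be judged most similar to the reference, mirroring how the OEI is used for ranking in the validation experiments.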
