Infrared and visible image fusion using structure-transferring fusion method

Abstract It is commonly believed that the purpose of image fusion is to merge as much information as possible from the original images, such as contour, texture, and intensity-distribution information, into the fused image. Most existing methods treat the different source images equally, applying the same feature-extraction operations to each during fusion. However, in infrared (IR) and visible image fusion, the source images come from imaging devices sensitive to different wave bands, so their features differ and are sometimes even contradictory; such contradictory information cannot be extracted and preserved simultaneously. To keep targets salient against cluttered backgrounds while producing a visually pleasing result, this paper proposes a novel IR and visible image fusion method called the structure-transferring fusion (STF) method. First, a structure-transferring model is built to transfer the grayscale structure of the visible input image into the IR image. Second, an infrared detail-enhancing strategy supplements the details missing from the IR image. Experimental results show that the proposed STF method is both effective and efficient for IR and visible image fusion. The final fused image, with conspicuous targets and vivid texture, benefits night-vision surveillance by human observers.
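The two-step pipeline described in the abstract (structure transfer, then detail enhancement) can be illustrated with a minimal, self-contained sketch. This is not the authors' actual STF model: the guided-filter formulation, the function names, and the parameters `r`, `eps`, and the detail-reinjection step are all illustrative assumptions showing one plausible way to transfer visible-image structure onto IR intensities and then restore lost IR detail.

```python
import numpy as np

def mean_filter(img, r):
    """Naive edge-padded mean filter with window radius r (fine for small images)."""
    pad = np.pad(img, r, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = pad[i:i + 2 * r + 1, j:j + 2 * r + 1].mean()
    return out

def guided_filter(guide, src, r=4, eps=1e-2):
    """Classic guided filter: smooth `src` while following the edges of `guide`."""
    m_g, m_s = mean_filter(guide, r), mean_filter(src, r)
    var_g = mean_filter(guide * guide, r) - m_g * m_g
    cov_gs = mean_filter(guide * src, r) - m_g * m_s
    a = cov_gs / (var_g + eps)   # local linear coefficient w.r.t. the guide
    b = m_s - a * m_g
    return mean_filter(a, r) * guide + mean_filter(b, r)

def stf_sketch(ir, vis, r=4, eps=1e-2):
    """Hypothetical analogue of the two STF steps; both images scaled to [0, 1]."""
    # Step 1 (assumed analogue): transfer the visible image's grayscale structure
    # onto the IR intensities by guided-filtering IR with the visible image as guide.
    base = guided_filter(vis, ir, r, eps)
    # Step 2 (assumed analogue): re-inject high-frequency IR detail lost in step 1,
    # so hot targets stay salient in the fused result.
    detail = ir - mean_filter(ir, r)
    return np.clip(base + detail, 0.0, 1.0)
```

With a constant guide the local variance is zero, so the guided filter degenerates to plain smoothing of the source; with a structured visible guide, the IR intensities are redistributed along the visible image's edges, which is the intuition behind structure transfer.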
