GTP-PNet: A residual learning network based on gradient transformation prior for pansharpening

Abstract Pansharpening aims to fuse a low-resolution multi-spectral (LRMS) image with a high-resolution panchromatic (PAN) image to produce a high-resolution multi-spectral (HRMS) image. In this paper, a new residual learning network based on a gradient transformation prior, termed GTP-PNet, is proposed to generate high-quality HRMS images with accurate spectral distribution and reasonable spatial structure. Unlike previous deep models that rely solely on supervision from the HRMS reference image, we introduce a gradient transformation prior into the deep model to improve solution accuracy. Our model consists of two networks: a gradient transformation network (TNet) and a pansharpening network (PNet). TNet seeks the nonlinear mapping between the gradients of the PAN and HRMS images, which is essentially a regression of the spatial relationship between imaging bands in different spectral ranges. PNet is a residual learning network that generates the HRMS image; it is supervised by the HRMS reference image and further constrained by the trained TNet. As a result, the HRMS image generated by PNet not only approximates the reference image in spectral distribution but also conforms to the gradient transformation prior in spatial structure. Experimental results demonstrate that our method significantly outperforms current state-of-the-art methods in both subjective visual quality and quantitative metrics. We also apply our method to the generation of the high-resolution normalized difference vegetation index (NDVI) in remote sensing, where it achieves the best performance. Moreover, our method is highly competitive with state-of-the-art alternatives in running efficiency.
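
The abstract describes a two-stage scheme: TNet is first trained to regress HRMS gradients from the PAN gradient, then frozen and used, together with the reference supervision, to constrain PNet. The snippet below is a minimal PyTorch-style sketch of that scheme; the layer configurations, the Sobel-based gradient operator, and the loss weight lambda_grad are illustrative assumptions, not the authors' exact design.

```python
# Minimal sketch of the GTP-PNet training scheme described in the abstract.
# Network depths, the Sobel gradient operator, and `lambda_grad` are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def gradient(x):
    """Approximate per-channel gradient magnitude with Sobel filters (assumed operator)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    c = x.shape[1]
    gx = F.conv2d(x, kx.repeat(c, 1, 1, 1).to(x), padding=1, groups=c)
    gy = F.conv2d(x, ky.repeat(c, 1, 1, 1).to(x), padding=1, groups=c)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)


class TNet(nn.Module):
    """Maps the single-band PAN gradient to the multi-band HRMS gradient."""
    def __init__(self, ms_bands=4, width=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, ms_bands, 3, padding=1),
        )

    def forward(self, pan_grad):
        return self.body(pan_grad)


class PNet(nn.Module):
    """Residual pansharpening network: upsampled LRMS plus a learned residual."""
    def __init__(self, ms_bands=4, width=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ms_bands + 1, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, ms_bands, 3, padding=1),
        )

    def forward(self, lrms, pan):
        up = F.interpolate(lrms, size=pan.shape[-2:], mode='bicubic', align_corners=False)
        return up + self.body(torch.cat([up, pan], dim=1))


def pnet_loss(pnet, tnet, lrms, pan, hrms_ref, lambda_grad=0.1):
    """Spectral term (reference supervision) + spatial term (gradient transformation prior)."""
    fused = pnet(lrms, pan)
    spectral = F.l1_loss(fused, hrms_ref)
    with torch.no_grad():                      # TNet is trained first, then frozen
        target_grad = tnet(gradient(pan))
    spatial = F.l1_loss(gradient(fused), target_grad)
    return spectral + lambda_grad * spatial
```

In this sketch, TNet would be pre-trained by minimizing a loss between tnet(gradient(pan)) and gradient(hrms_ref) on the training set before pnet_loss is used to optimize PNet.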
