Fusion of hyperspectral data using segmented PCT for color representation and classification

Fusion of hyperspectral data is proposed by partitioning the hyperspectral bands into subgroups prior to principal components transformation (PCT). The first principal component of each subgroup is employed for image visualization. The approach is general, with the number of bands in each subgroup being application dependent; the paper, however, focuses on partitions into three subgroups suitable for RGB representation. One of these partitioning schemes employs matched filtering based on the spectral characteristics of various materials and is particularly promising for classification purposes. The information content of the hyperspectral bands, as well as the quality of the resulting RGB images, is quantitatively assessed using measures such as the correlation coefficient, the entropy, and the maximum energy-minimum correlation index. The classification performance of the proposed partitioning approaches is tested using the K-means algorithm.
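A minimal sketch of the segmented-PCT idea is given below, assuming a NumPy cube of shape (rows, cols, bands), an equal-size contiguous split into three subgroups, and a simple percentile stretch for display; the band grouping, the stretch, and all function names here are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def first_pc(bands_2d):
    """Return the first principal component scores of a (pixels, bands) matrix."""
    centered = bands_2d - bands_2d.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    pc1 = eigvecs[:, -1]                     # eigenvector of the largest eigenvalue
    return centered @ pc1

def segmented_pct_rgb(cube, splits=3):
    """Partition the spectral bands into `splits` contiguous subgroups, apply PCT
    to each subgroup, and map each subgroup's first PC to one display channel."""
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands).astype(np.float64)
    edges = np.linspace(0, bands, splits + 1, dtype=int)
    channels = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        pc = first_pc(flat[:, lo:hi])
        # 2-98 percentile stretch to 8 bits (an illustrative display choice)
        p2, p98 = np.percentile(pc, [2, 98])
        stretched = np.clip((pc - p2) / (p98 - p2 + 1e-12), 0.0, 1.0)
        channels.append((stretched * 255).astype(np.uint8).reshape(rows, cols))
    return np.dstack(channels)   # (rows, cols, 3) RGB composite when splits == 3

if __name__ == "__main__":
    cube = np.random.rand(64, 64, 30)        # synthetic stand-in for a hyperspectral cube
    rgb = segmented_pct_rgb(cube)
    print(rgb.shape, rgb.dtype)              # (64, 64, 3) uint8
```

The same per-subgroup principal components (or, for the matched-filtering partition, the filter outputs used to form the subgroups) could also serve as a reduced feature space for an unsupervised classifier such as K-means, in line with the classification experiments reported in the paper.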
