Semi-supervised feature learning for hyperspectral image classification

Hyperspectral images have high-dimensional spectral–spatial features that contain noisy and redundant information. Since redundant features can have a significant adverse effect on learning performance, efficient and robust feature selection methods should make the best of both labeled and unlabeled points to extract meaningful features and eliminate noisy ones. On the other hand, obtaining sufficient accurately labeled data is either impossible or expensive. In order to take advantage of both the precious labeled points and the abundant unlabeled points, in this paper we propose a new semi-supervised feature selection method. First, labeled points are used to enlarge the margin between data points from different classes; second, unlabeled points are used to discover the local structure of the data space; finally, we compare the proposed algorithm with the Fisher score, PCA, and the Laplacian score on HSI classification. Experimental results on benchmark hyperspectral data sets demonstrate the efficiency and effectiveness of the proposed algorithm.
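The general flavor of the approach, supervised class-separation evidence from labeled points combined with unsupervised locality evidence from all points, can be sketched as follows. This is a minimal illustration under assumptions, not the paper's actual algorithm: the kNN heat-kernel graph, the trade-off weight `alpha`, and all function names are ours. The Fisher score and Laplacian score shown are the standard baselines the abstract compares against.

```python
import numpy as np

def fisher_score(X, y):
    """Supervised Fisher score per feature: between-class scatter over
    within-class scatter (larger = more discriminative)."""
    mu = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

def laplacian_score(X, k=5):
    """Unsupervised Laplacian score per feature on a kNN heat-kernel
    graph (smaller = feature better preserves local structure)."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-d2 / (np.median(d2) + 1e-12))            # heat-kernel weights
    far = np.argsort(d2, axis=1)[:, k + 1:]              # keep self + k nearest
    for i in range(n):
        W[i, far[i]] = 0.0
    W = np.maximum(W, W.T)                               # symmetrise the graph
    D = W.sum(axis=1)
    L = np.diag(D) - W                                   # graph Laplacian
    scores = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        f = X[:, j] - X[:, j].dot(D) / D.sum()           # remove weighted mean
        scores[j] = f.dot(L).dot(f) / (f.dot(D * f) + 1e-12)
    return scores

def semi_supervised_rank(X, labeled, y, k=5, alpha=0.5):
    """Rank features best-first by combining the Fisher score on the
    labeled subset with (one minus) the Laplacian score on all points.
    alpha is an assumed trade-off weight, not taken from the paper."""
    fs = fisher_score(X[labeled], y[labeled])
    ls = laplacian_score(X, k=k)
    fs = (fs - fs.min()) / (fs.max() - fs.min() + 1e-12)  # rescale to [0, 1]
    ls = (ls - ls.min()) / (ls.max() - ls.min() + 1e-12)
    return np.argsort(-(alpha * fs + (1 - alpha) * (1 - ls)))
```

Selecting the top-ranked bands with such a score and feeding them to an SVM classifier (e.g. LIBSVM [6]) mirrors the experimental comparison described above.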

[1]  Ian T. Jolliffe, et al. Principal Component Analysis, 2002, International Encyclopedia of Statistical Science.

[2]  S. T. Roweis, et al. Nonlinear dimensionality reduction by locally linear embedding, 2000, Science.

[3]  Mikhail Belkin, et al. Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering, 2001, NIPS.

[4]  Chein-I Chang. Hyperspectral Data Exploitation: Theory and Applications, 2007.

[5]  James E. Fowler, et al. Locality-Preserving Dimensionality Reduction and Classification for Hyperspectral Image Analysis, 2012, IEEE Transactions on Geoscience and Remote Sensing.

[6]  Chih-Jen Lin, et al. LIBSVM: A library for support vector machines, 2011, TIST.

[7]  Hiroshi Motoda, et al. Feature Selection for Knowledge Discovery and Data Mining, 1998, The Springer International Series in Engineering and Computer Science.

[8]  Feiping Nie, et al. Discriminative Least Squares Regression for Multiclass Classification and Feature Selection, 2012, IEEE Transactions on Neural Networks and Learning Systems.

[9]  Jane Labadin, et al. Feature selection based on mutual information, 2015, 9th International Conference on IT in Asia (CITA).

[10]  Fuhui Long, et al. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy, 2003, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[11]  Deng Cai, et al. Laplacian Score for Feature Selection, 2005, NIPS.

[12]  David A. Landgrebe, et al. Analyzing high-dimensional multispectral data, 1993, IEEE Transactions on Geoscience and Remote Sensing.

[13]  Masoud Nikravesh, et al. Feature Extraction: Foundations and Applications, 2006, Feature Extraction.

[14]  Fan Chung, et al. Spectral Graph Theory, 1996.

[15]  Zi Huang, et al. ℓ2,1-Norm Regularized Discriminative Feature Selection for Unsupervised Learning, 2011, Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence (IJCAI).

[16]  Bo Du, et al. Semisupervised Discriminative Locally Enhanced Alignment for Hyperspectral Image Classification, 2013, IEEE Transactions on Geoscience and Remote Sensing.

[17]  Ron Kohavi, et al. Feature Selection for Knowledge Discovery and Data Mining, 1998.

[18]  Stan Lipovetsky, et al. PCA and SVD with nonnegative loadings, 2009, Pattern Recognition.