Hyperspectral data analysis and supervised feature reduction via projection pursuit

As the number of spectral bands of high-spectral-resolution data increases, the ability to detect more detailed classes should increase, and so should classification accuracy. However, the number of labelled samples available for supervised classification is often limited, which limits the precision with which class characteristics can be estimated. As the number of spectral bands grows large, the limitation on performance imposed by the small number of training samples can become severe. A number of techniques for case-specific feature extraction have been developed to reduce dimensionality without loss of class separability. Most of these techniques require estimating statistics at full dimensionality in order to extract relevant features for classification; if the number of training samples is not adequately large, these high-dimensional parameter estimates will not be accurate enough, and the resulting features may not be as effective as they could be. This suggests the need for a preprocessing method that reduces dimensionality while taking high-dimensional feature-space properties into account, so that feature-extraction parameters can subsequently be estimated more accurately. Such an algorithm has been developed using a technique referred to as projection pursuit (PP). It bypasses many of the problems caused by the limited number of training samples by performing its computations in a lower-dimensional space, where it optimizes a function called the projection index. A current limitation of the method is that, as the number of dimensions increases, the optimization is likely to find a local maximum of the projection index that does not fully exploit the capabilities of hyperspectral data.
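The abstract does not specify an implementation, but the general idea can be illustrated with a minimal Python sketch. It assumes a Fisher-ratio class-separability measure as the projection index and synthetic two-class data; the data, optimizer, and function names are illustrative stand-ins, not the authors' algorithm. The key point it demonstrates is that class statistics are estimated only in the projected, low-dimensional space rather than at full dimensionality.

```python
# Minimal sketch of supervised projection pursuit for feature reduction.
# The projection index here is a Fisher ratio computed on the *projected*
# (one-dimensional) data, so class statistics are never estimated at full
# dimensionality. The index, toy data, and optimizer are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy two-class data in a 50-band "hyperspectral" space with few samples.
d, n_per_class = 50, 20
mean_shift = np.zeros(d)
mean_shift[:5] = 1.5                      # classes differ in a few bands only
X0 = rng.normal(0.0, 1.0, size=(n_per_class, d))
X1 = rng.normal(0.0, 1.0, size=(n_per_class, d)) + mean_shift
X = np.vstack([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

def projection_index(a, X, y):
    """Fisher ratio of the data projected onto direction a (larger is better)."""
    a = a / np.linalg.norm(a)
    z = X @ a                              # statistics computed in 1-D, not in R^d
    z0, z1 = z[y == 0], z[y == 1]
    between = (z0.mean() - z1.mean()) ** 2
    within = z0.var() + z1.var() + 1e-12
    return between / within

# Maximize the index by minimizing its negative, starting from a random
# direction; the outcome depends on the starting point, which reflects the
# local-maximum issue noted in the abstract.
a0 = rng.normal(size=d)
res = minimize(lambda a: -projection_index(a, X, y), a0, method="L-BFGS-B")
a_star = res.x / np.linalg.norm(res.x)

print("index at random start :", projection_index(a0, X, y))
print("index after pursuit   :", projection_index(a_star, X, y))
```

Re-running the sketch with different random starting directions shows how the numerical optimization can settle into different local maxima of the projection index, the limitation discussed at the end of the abstract.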
