Applying boosting for hyperspectral classification of ore-bearing rocks

Hyperspectral sensors provide a powerful tool for non-destructive analysis of rocks. While classification of spectrally distinct materials can be performed with traditional methods, identifying different rock types or grades composed of similar materials remains a challenge because their spectra are often nearly identical. In this paper, we investigate the application of boosting algorithms to classify hyperspectral data of ore rock samples into multiple discrete categories. Two variants of boosting, GentleBoost and LogitBoost, were implemented and compared against a Support Vector Machine benchmark. Two pre-processing transformations that may improve classification accuracy were investigated: derivative analysis and smoothing, both computed with the Savitzky-Golay method. To assess the performance of the algorithms on noisy data, white Gaussian noise was added to the data set at various levels. We present experimental results on hyperspectral data collected from rock samples from an iron ore mine.
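The sketch below illustrates the kind of pipeline the abstract describes: Savitzky-Golay smoothing/derivative pre-processing, controlled addition of white Gaussian noise, and a comparison of boosted classifiers against an SVM. It is not the authors' implementation; the synthetic spectra, the window and polynomial settings, and the use of scikit-learn's AdaBoostClassifier and GradientBoostingClassifier as stand-ins for GentleBoost and LogitBoost are all assumptions made for illustration.

```python
# Minimal sketch of a hyperspectral classification pipeline (assumed details,
# not the paper's implementation): synthetic spectra, Savitzky-Golay features,
# additive white Gaussian noise, and boosting vs. SVM comparison.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic hyperspectral data standing in for the ore samples:
# 300 spectra, 200 bands, 4 rock classes.
n_samples, n_bands, n_classes = 300, 200, 4
y = rng.integers(0, n_classes, size=n_samples)
band_axis = np.linspace(0.0, 1.0, n_bands)
X = np.stack([np.sin(2 * np.pi * (c + 1) * band_axis) + 0.05 * c for c in y])

def add_white_gaussian_noise(X, snr_db):
    """Add white Gaussian noise at a target per-spectrum SNR (in dB)."""
    signal_power = np.mean(X ** 2, axis=1, keepdims=True)
    noise_power = signal_power / (10 ** (snr_db / 10))
    return X + rng.normal(0.0, np.sqrt(noise_power), size=X.shape)

def savitzky_golay_features(X, window=11, polyorder=3, deriv=1):
    """Savitzky-Golay smoothing (deriv=0) or derivative (deriv>=1) along the bands."""
    return savgol_filter(X, window_length=window, polyorder=polyorder,
                         deriv=deriv, axis=1)

X_noisy = add_white_gaussian_noise(X, snr_db=20)
X_feat = savitzky_golay_features(X_noisy, deriv=1)  # first-derivative spectra

X_tr, X_te, y_tr, y_te = train_test_split(X_feat, y, test_size=0.3, random_state=0)

classifiers = {
    # Stand-ins only: boosted decision stumps for GentleBoost, log-loss
    # gradient boosting for LogitBoost, and an RBF-kernel SVM as the benchmark.
    "boosted stumps": AdaBoostClassifier(n_estimators=100),
    "logistic boosting": GradientBoostingClassifier(n_estimators=100),
    "SVM (RBF)": SVC(kernel="rbf", C=10.0, gamma="scale"),
}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```

Repeating the last two steps over a range of `snr_db` values reproduces the structure of the robustness experiment described in the abstract, with accuracy tracked as a function of noise level.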
