Active learning schemes for reduced dimensionality hyperspectral classification

Statistical schemes offer advantages that promote their use in a variety of pattern recognition problems. In this paper, we study the application of two statistical learning criteria for material classification of hyperspectral remote sensing data. In most cases, hyperspectral data are characterized using a Gaussian mixture model (GMM). The difficulty in using a statistical model such as the GMM lies in estimating the class-conditional probability density functions from the exemplars available in the training data for each class. We demonstrate two training schemes, dynamic component allocation (DCA) and the minimum message length (MML) criterion, which are employed to learn the mixture parameters from the observations. The trained models are then evaluated using a Bayesian classifier.
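The classification setting described above, fitting one GMM per class as an estimate of the class-conditional density and then applying Bayes' rule, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the two-class synthetic data, the number of mixture components, and the uniform priors are all assumptions chosen for the example, and scikit-learn's `GaussianMixture` (plain EM) stands in for the DCA/MML training schemes studied in the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-in for hyperspectral pixels: two classes, five spectral bands.
X0 = rng.normal(0.0, 1.0, (200, 5))   # class 0 samples
X1 = rng.normal(3.0, 1.0, (200, 5))   # class 1 samples

# One GMM per class approximates the class-conditional density p(x | c).
gmms = {
    c: GaussianMixture(n_components=2, covariance_type="diag",
                       random_state=0).fit(X)
    for c, X in {0: X0, 1: X1}.items()
}
priors = {0: 0.5, 1: 0.5}  # assumed uniform class priors

def classify(x):
    """Bayes classifier: argmax_c [ log p(x | c) + log P(c) ]."""
    scores = {
        c: gmm.score_samples(x.reshape(1, -1))[0] + np.log(priors[c])
        for c, gmm in gmms.items()
    }
    return max(scores, key=scores.get)
```

A pixel near the class-1 mean, e.g. `classify(np.full(5, 3.0))`, is assigned to class 1, while `classify(np.zeros(5))` is assigned to class 0. Replacing the fixed `n_components` with a data-driven choice is precisely where criteria such as DCA or MML enter.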
