Increasing Classification Robustness with Adaptive Features

In machine vision, features are the basis for almost any kind of high-level post-processing, such as classification. A new method is developed that exploits the inherent flexibility of feature calculation to optimize the features for a given classification task. By tuning the parameters of the feature calculation, the accuracy of a subsequent classification can be improved significantly and the decision boundaries simplified. The method focuses on surface inspection problems and on the features and classifiers typically used in these applications.
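The core idea of tuning feature-calculation parameters for a specific classification task can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the data, the smoothing-width parameter, and the nearest-class-mean classifier are all hypothetical stand-ins for a parameterized feature and a downstream classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surface-inspection data: 1-D intensity profiles where the
# defect class (label 1) carries a higher-frequency texture component.
def make_profile(label, n=64):
    t = np.linspace(0, 1, n)
    freq = 4 if label == 0 else 12
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(n)

y = np.array([0, 1] * 50)
X = [make_profile(label) for label in y]

# Adaptive feature: mean energy of the profile after smoothing with a
# moving average of width w.  The width w is the tunable feature parameter.
def feature(profile, w):
    kernel = np.ones(w) / w
    smoothed = np.convolve(profile, kernel, mode="valid")
    return np.mean(smoothed ** 2)

# Wrapper-style objective: classification accuracy of a nearest-class-mean
# classifier applied to the single scalar feature computed with width w.
def accuracy(w):
    f = np.array([feature(p, w) for p in X])
    m0, m1 = f[y == 0].mean(), f[y == 1].mean()
    pred = (np.abs(f - m1) < np.abs(f - m0)).astype(int)
    return float((pred == y).mean())

# Tune the feature parameter for the classification task: a wider window
# damps the high-frequency defect texture, separating the two classes.
best_w = max(range(1, 21), key=accuracy)
```

With no smoothing (w = 1) both classes have nearly identical energy and the classifier is close to chance; searching over w finds a width at which the scalar feature separates the classes well, which is the kind of accuracy gain the abstract describes.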
