RSGALS-SVM: Random Subspace Method Applied to a LS-SVM Ensemble Optimized by Genetic Algorithm

Support Vector Machines (SVMs) have received great attention in pattern classification due to their good generalization ability. The Least Squares formulation of the SVM (LS-SVM) finds the solution by solving a set of linear equations instead of a quadratic programming problem. Both SVMs and LS-SVMs have free parameters that must be tuned to the requirements of the given task. Despite their high performance, many techniques have been developed to improve them further, notably new classification methods and the use of ensembles. In this paper, we propose to combine ensemble theory with a genetic algorithm to enhance LS-SVM classification. First, we randomly divide the problem into feature subspaces to generate diversity among the classifiers of the ensemble. Then, we apply a genetic algorithm to optimize the classification produced by this ensemble of LS-SVMs, evaluating the approach on several benchmark data sets.
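The two core ingredients of the abstract can be sketched in a few lines of numpy: an LS-SVM classifier, which trains by solving one linear system (the dual saddle-point system of Suykens' formulation) rather than a QP, and a random-subspace ensemble that trains each member on a random subset of features and combines them by majority vote. This is a minimal illustrative sketch, not the paper's implementation: the kernel width `sigma`, regularization `gamma`, and the majority-vote combiner are assumptions, and the paper's genetic-algorithm step for optimizing the ensemble combination is omitted.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

class LSSVM:
    """Minimal least-squares SVM classifier for labels in {-1, +1}."""

    def __init__(self, gamma=1.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma  # regularization, kernel width

    def fit(self, X, y):
        n = len(y)
        K = rbf_kernel(X, X, self.sigma)
        # LS-SVM dual: one (n+1)x(n+1) linear system instead of a QP
        #   [ 0   1^T          ] [ b     ]   [ 0 ]
        #   [ 1   K + I/gamma  ] [ alpha ] = [ y ]
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def predict(self, X):
        K = rbf_kernel(X, self.X, self.sigma)
        return np.sign(K @ self.alpha + self.b)

def random_subspace_ensemble(X, y, n_members=5, k=2, rng=None):
    # Random subspace method: each member sees only k randomly chosen features
    rng = rng if rng is not None else np.random.default_rng(0)
    members = []
    for _ in range(n_members):
        feats = rng.choice(X.shape[1], size=k, replace=False)
        members.append((feats, LSSVM().fit(X[:, feats], y)))
    return members

def vote(members, X):
    # Unweighted majority vote; the paper instead tunes the combination
    # with a genetic algorithm (not shown here)
    votes = sum(model.predict(X[:, feats]) for feats, model in members)
    return np.sign(votes)
```

Training on different feature subsets is what injects the diversity the abstract refers to; an odd number of members keeps the majority vote from tying.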
