Random Subspace Method and Genetic Algorithm Applied to a LS-SVM Ensemble

The least squares formulation of the SVM (LS-SVM) finds its solution by solving a set of linear equations instead of the quadratic program required by the standard SVM. LS-SVMs have free parameters that must be chosen correctly in order to achieve good performance. Many techniques have been developed to improve classifier performance, chiefly new classification methods and the use of ensembles. In this paper, we propose to combine ensemble theory with a genetic algorithm to improve LS-SVM classification. First, we randomly divide the input space into subspaces to generate diversity among the classifiers of the ensemble. Then, we apply a genetic algorithm both to find the values of the LS-SVM parameters and to find the weights of the linear combination of the ensemble members, which is used to take the final decision.
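The paper gives only this outline, but the pipeline is straightforward to sketch. The following minimal Python/NumPy illustration shows one way the described method could fit together, under our own assumptions: RBF kernels, the standard Suykens-Vandewalle linear system for LS-SVM training, and a simple elitist real-coded GA with uniform crossover and Gaussian mutation, using validation accuracy of the weighted fusion as fitness. All function names, GA settings, and the chromosome layout are hypothetical choices for illustration, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, sigma):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma, sigma):
    """Train an LS-SVM (labels in {-1, +1}): one linear system, no QP."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.r_[0.0, y])   # solves for [b, alpha_1..alpha_n]
    return sol[0], sol[1:]

def lssvm_out(Xtr, b, alpha, sigma, Xte):
    """Real-valued LS-SVM output; its sign is the predicted class."""
    return rbf(Xte, Xtr, sigma) @ alpha + b

def random_subspaces(n_features, n_members=5, frac=0.6):
    """Random subspace method: each member sees a random feature subset."""
    k = max(1, int(frac * n_features))
    return [rng.choice(n_features, k, replace=False) for _ in range(n_members)]

def fitness(chrom, subs, Xtr, ytr, Xval, yval):
    """Chromosome = (log gamma_i, log sigma_i) per member + m mixing weights."""
    m = len(subs)
    outs = []
    for i, feats in enumerate(subs):
        gamma, sigma = np.exp(chrom[2 * i]), np.exp(chrom[2 * i + 1])
        b, alpha = lssvm_fit(Xtr[:, feats], ytr, gamma, sigma)
        outs.append(lssvm_out(Xtr[:, feats], b, alpha, sigma, Xval[:, feats]))
    pred = np.sign(np.stack(outs).T @ chrom[2 * m:])  # weighted linear fusion
    return (pred == yval).mean()

def ga(subs, Xtr, ytr, Xval, yval, pop=30, gens=40):
    """Elitist real-coded GA: uniform crossover + Gaussian mutation."""
    dim = 3 * len(subs)                  # 2 kernel genes + 1 weight per member
    P = rng.normal(0.0, 1.0, (pop, dim))
    for _ in range(gens):
        f = np.array([fitness(c, subs, Xtr, ytr, Xval, yval) for c in P])
        P = P[np.argsort(f)[::-1]]       # best chromosomes first
        for j in range(pop // 2, pop):   # rebuild the bottom half from the top
            a, b = P[rng.integers(0, pop // 2, 2)]
            child = np.where(rng.random(dim) < 0.5, a, b)
            P[j] = child + rng.normal(0.0, 0.1, dim)
    f = np.array([fitness(c, subs, Xtr, ytr, Xval, yval) for c in P])
    return P[np.argmax(f)], f.max()

# Toy usage on synthetic data (labels must be -1/+1).
X = rng.normal(size=(120, 10))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
Xtr, ytr, Xval, yval = X[:80], y[:80], X[80:], y[80:]
subs = random_subspaces(X.shape[1])
best, acc = ga(subs, Xtr, ytr, Xval, yval)
print(f"validation accuracy of evolved ensemble: {acc:.2f}")
```

Encoding the kernel and regularization parameters as logarithms keeps them positive without explicit constraint handling, and placing the combination weights on the same chromosome lets the GA tune each member and the fusion jointly, which is the coupling the abstract proposes.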
