Simultaneous feature and support vector selection with parameter optimization using a GA-based SVM for binary classification

Feature selection and parameter optimization are important steps in applying SVMs, and in recent years most research has focused on these two tasks. However, the number of support vectors in the selected support vector subset also affects the classification performance of an SVM, and few researchers have concentrated on this area. This paper proposes a novel genetic-algorithm-based optimization approach that selects the support vector subset and the feature subset simultaneously while also searching for the best penalty parameter C and kernel function parameters. We conduct experiments on real-world datasets from the publicly available UCI Machine Learning Repository using both the proposed approach and a GA-based feature selection (FS) technique. The experimental results show that the proposed approach can efficiently choose optimal input features together with the SVM parameters and achieves the best classification performance. Moreover, the proposed optimization method produces a less complex SVM model with fewer support vectors.
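The joint optimization described above can be sketched as a simple genetic algorithm whose chromosome encodes a training-instance mask (candidate support vectors), a feature mask, and the two hyperparameters C and the RBF kernel width gamma. This is a minimal illustrative sketch, not the authors' implementation: the synthetic dataset, population size, encoding ranges, and fitness function are all assumptions made for the example.

```python
# Illustrative GA sketch (assumed design, not the paper's exact method):
# chromosome = [feature mask | instance mask | log2(C) gene | log2(gamma) gene],
# all genes stored as random keys in [0, 1]; binary genes are thresholded at 0.5.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=5, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

n_inst, n_feat = Xtr.shape
POP, GENS, MUT_RATE = 20, 15, 0.05

def decode(chrom):
    feat = chrom[:n_feat] > 0.5                    # selected features
    inst = chrom[n_feat:n_feat + n_inst] > 0.5     # selected training instances
    C = 2.0 ** (-5 + 20 * chrom[-2])               # C in [2^-5, 2^15] (assumed grid)
    gamma = 2.0 ** (-15 + 18 * chrom[-1])          # gamma in [2^-15, 2^3]
    return feat, inst, C, gamma

def fitness(chrom):
    feat, inst, C, gamma = decode(chrom)
    # invalid chromosomes (no features, or one class only) get zero fitness
    if feat.sum() == 0 or len(np.unique(ytr[inst])) < 2:
        return 0.0
    clf = SVC(C=C, gamma=gamma).fit(Xtr[inst][:, feat], ytr[inst])
    return clf.score(Xte[:, feat], yte)            # held-out accuracy as fitness

pop = rng.random((POP, n_feat + n_inst + 2))
for _ in range(GENS):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[::-1][:POP // 2]]   # truncation selection
    children = []
    while len(parents) + len(children) < POP:
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, pop.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])       # one-point crossover
        mut = rng.random(child.shape) < MUT_RATE         # random-reset mutation
        child[mut] = rng.random(mut.sum())
        children.append(child)
    pop = np.vstack([parents] + children)

best = max(pop, key=fitness)
feat, inst, C, gamma = decode(best)
print(f"features kept: {feat.sum()}/{n_feat}, "
      f"instances kept: {inst.sum()}/{n_inst}, C={C:.3g}, gamma={gamma:.3g}")
```

Because the instance mask shrinks the training set, the fitted SVM can only draw its support vectors from the retained instances, which is one way to read the paper's claim of a less complex model with fewer support vectors.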