Radial Basis Function Kernel Parameter Optimization Algorithm in Support Vector Machine Based on Segmented Dichotomy

By analyzing the influence of the kernel parameter and the penalty factor on the generalization performance of the Support Vector Machine (SVM), a novel parameter optimization algorithm based on segmented dichotomy is proposed for the Radial Basis Function (RBF) kernel. By combining Segmented Dichotomy (SD) with Grid Search (GS), a composite parameter-selection algorithm, SD-GS, is constructed for rapid optimization of the kernel parameter and the penalty factor. The proposed method is tested on datasets from the UCI Machine Learning Repository. Experimental results show that its parameter-selection performance is better than that of exhaustive exponential grid search. Thus, the parameter combination found by the SD-GS algorithm gives the RBF-kernel SVM higher generalization performance.
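The abstract does not spell out the SD-GS procedure, so the following is only a minimal sketch of the general idea: a coarse exponential grid search over (C, gamma) followed by a dichotomy-style refinement that repeatedly halves the search span around the current best point, with cross-validated accuracy assumed as the selection criterion. The dataset, step sizes, and iteration counts are illustrative assumptions, not the paper's settings.

# Illustrative sketch only: not the paper's exact SD-GS algorithm.
# Stage 1 is a coarse exponential grid search; stage 2 is a dichotomy-style
# refinement that halves the log2 search span each iteration, which is where
# the speed-up over exhaustive grid search would come from.
import numpy as np
from sklearn.datasets import load_iris          # stand-in for a UCI dataset
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def cv_accuracy(X, y, C, gamma, folds=5):
    """Mean cross-validated accuracy of an RBF-kernel SVM for a given (C, gamma)."""
    return cross_val_score(SVC(C=C, gamma=gamma, kernel="rbf"), X, y, cv=folds).mean()

def coarse_grid_search(X, y, exponents=range(-8, 9, 2)):
    """Stage 1: coarse exponential grid over C = 2^i, gamma = 2^j."""
    best = (None, None, -np.inf)
    for i in exponents:
        for j in exponents:
            acc = cv_accuracy(X, y, 2.0 ** i, 2.0 ** j)
            if acc > best[2]:
                best = (2.0 ** i, 2.0 ** j, acc)
    return best

def dichotomy_refine(X, y, C, gamma, span=2.0, iters=5):
    """Stage 2: examine neighbors of the current best point in log2 space,
    then halve the span, instead of re-scanning the whole grid."""
    best_acc = cv_accuracy(X, y, C, gamma)
    for _ in range(iters):
        base_C, base_g = C, gamma
        for dc in (-span, 0.0, span):
            for dg in (-span, 0.0, span):
                c_try, g_try = base_C * 2.0 ** dc, base_g * 2.0 ** dg
                acc = cv_accuracy(X, y, c_try, g_try)
                if acc > best_acc:
                    C, gamma, best_acc = c_try, g_try, acc
        span /= 2.0          # halve the interval, as in a bisection search
    return C, gamma, best_acc

if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    C0, g0, acc0 = coarse_grid_search(X, y)
    C, gamma, acc = dichotomy_refine(X, y, C0, g0)
    print(f"coarse best: C={C0:g}, gamma={g0:g}, acc={acc0:.3f}")
    print(f"refined:     C={C:g}, gamma={gamma:g}, acc={acc:.3f}")

Under these assumptions, the refinement stage evaluates only a handful of neighboring points per iteration, so the total number of SVM trainings grows much more slowly than in a full exponential grid traversal at the same resolution.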
