A Practical Parameter Selection Method for SVM

The performance of the Support Vector Machine (SVM) depends strongly on its model parameters. One commonly used parameter selection method for SVM, grid search (GS), is very time consuming. This paper introduces Uniform Design (UD) and Support Vector Regression (SVR) to reduce the computational cost of the traditional GS method: the error bound of the SVM is computed only at the nodes selected by the UD method, and an SVR model is then trained on these results. The trained SVR function estimates the error bound at the remaining nodes, and the optimized parameters are selected according to the estimated values. Experiments on seven standard datasets show that the parameters selected by the proposed method yield test error rates similar to those obtained by the conventional GS method, while the computational cost is reduced from at most O(n^m) to O(n), where m is the number of parameters and n is the number of levels of each parameter.
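To make the procedure concrete, below is a minimal Python sketch of the idea, assuming scikit-learn is available. The cross-validation error rate stands in for whatever error bound the paper actually uses, the sampling pattern for the design nodes is a simple placeholder rather than a formal uniform design table, and the dataset and parameter ranges are illustrative choices, not those of the experiments.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, SVR

# Candidate grid: n levels per parameter on a log2 scale,
# as in a conventional grid search over (C, gamma).
log2_C = np.arange(-5, 16, 2.0)    # C in 2^-5 .. 2^15
log2_g = np.arange(-15, 4, 2.0)    # gamma in 2^-15 .. 2^3
grid = np.array([(c, g) for c in log2_C for g in log2_g])

# A handful of design nodes spread over the parameter plane.
# Placeholder pattern, not a formal uniform design table.
ud_idx = np.linspace(0, len(grid) - 1, 13, dtype=int)
ud_nodes = grid[ud_idx]

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

def cv_error(log2_c, log2_gamma):
    """Cross-validation error of an RBF-SVM at one (C, gamma) node
    (a stand-in for the error bound used in the paper)."""
    clf = SVC(C=2.0 ** log2_c, gamma=2.0 ** log2_gamma, kernel="rbf")
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

# Step 1: evaluate the expensive error estimate only at the design nodes.
ud_errors = np.array([cv_error(c, g) for c, g in ud_nodes])

# Step 2: train an SVR that maps (log2 C, log2 gamma) to estimated error.
surrogate = SVR(kernel="rbf", C=10.0, gamma="scale").fit(ud_nodes, ud_errors)

# Step 3: predict the error on every grid node and pick the minimizer,
# so only the design nodes ever require a full SVM evaluation.
predicted = surrogate.predict(grid)
best_c, best_g = grid[np.argmin(predicted)]
print(f"selected C = 2^{best_c:.0f}, gamma = 2^{best_g:.0f}")
```

In this sketch the full SVM is trained only at the design nodes, while the remaining grid nodes are scored by the cheap SVR surrogate, which is the source of the claimed reduction from O(n^m) to O(n) evaluations.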