Evaluation of Performance Measures for SVR Hyperparameter Selection

To obtain accurate modeling results, it is of primary importance to find suitable values for the hyperparameters of the Support Vector Regression (SVR) model. In general, one searches for the parameter values that minimize an estimate of the generalization error. In this study, we empirically compare several performance measures from the literature: k-fold cross-validation; the computationally intensive but almost unbiased leave-one-out error, together with its upper bounds, the radius/margin and span bounds; Vapnik's measure, which uses an estimate of the VC dimension; and the regularized risk functional itself. For each estimate we examine its accuracy, its computational complexity, and the presence of local minima. The latter strongly affects whether gradient-based search techniques can be applied to determine the optimal parameters.
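As an illustration of the first and most common of these measures, the following is a minimal sketch of selecting SVR hyperparameters by minimizing the k-fold cross-validation error over a grid. It assumes scikit-learn as the toolkit and uses a small synthetic regression problem; the grid values and data are illustrative, not taken from the study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Synthetic 1-D regression data: noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# Candidate hyperparameter values (illustrative grid).
param_grid = {
    "C": [0.1, 1.0, 10.0],
    "gamma": [0.1, 1.0],
    "epsilon": [0.01, 0.1],
}

# 5-fold cross-validation: the grid point with the best mean
# validation score (R^2 by default for regression) is selected
# as the hyperparameter estimate.
search = GridSearchCV(SVR(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
best_params = search.best_params_
```

Grid search evaluates the CV error at every grid point, so it needs no smoothness of the error surface; gradient-based alternatives are faster per evaluation but, as noted above, are sensitive to local minima in the measure being minimized.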
