A Leave-One-Out Bound for ν-Support Vector Regression

An upper bound on the leave-one-out (LOO) error for ν-support vector regression (ν-SVR) is presented. The bound is based on the geometrical concept of span. Because computing the LOO error directly is extremely time consuming, the parameters of ν-SVR can instead be selected by minimizing this upper bound. The bound can also be used to estimate the generalization performance of ν-SVR. Experiments on two data sets show that the bound presented herein provides an informative and efficient approximation of the generalization behavior.
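The abstract does not give the bound's formula, so the following is only a minimal sketch of the model-selection procedure it describes: choose ν-SVR hyperparameters by minimizing a cheap criterion instead of the exact LOO error. The `span_bound` function is a hypothetical placeholder for the paper's span-based bound, and the exact-LOO baseline uses scikit-learn's `NuSVR`, `LeaveOneOut`, and `cross_val_score`, which are assumptions of this sketch rather than anything taken from the paper.

```python
# Sketch of hyperparameter selection for nu-SVR: minimize a cheap upper bound
# on the LOO error rather than recomputing the LOO error itself.
import numpy as np
from sklearn.svm import NuSVR
from sklearn.model_selection import LeaveOneOut, cross_val_score


def loo_error(X, y, **params):
    """Exact LOO mean squared error: refits the model n times, which is the
    expensive computation the span-based bound is meant to avoid."""
    model = NuSVR(**params)
    scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    return -scores.mean()


def span_bound(X, y, **params):
    """Hypothetical placeholder for the paper's span-based LOO upper bound.
    The abstract does not state the formula; a real implementation would fit
    nu-SVR once and evaluate the bound from the support vectors' spans."""
    raise NotImplementedError("substitute the span-based LOO bound here")


def select_parameters(X, y, grid, criterion):
    """Return the hyperparameter setting in `grid` that minimizes `criterion`."""
    return min(grid, key=lambda params: criterion(X, y, **params))


if __name__ == "__main__":
    # Toy regression data for illustration only.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(80, 1))
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)

    grid = [{"nu": nu, "C": C, "gamma": "scale"}
            for nu in (0.2, 0.5, 0.8) for C in (0.1, 1.0, 10.0)]

    # Baseline: exhaustive LOO (slow). Once the span-based bound is
    # implemented, pass `span_bound` here instead of `loo_error`.
    best = select_parameters(X, y, grid, loo_error)
    print("selected parameters:", best)
```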
