Two-dimensional solution path for support vector regression

Recently, a very appealing approach was proposed for computing the entire solution path of support vector classification (SVC) at very low extra computational cost. This approach was later extended to a support vector regression (SVR) model called ε-SVR. However, that method requires the error parameter ε to be set a priori, which is possible only when the desired approximation accuracy can be specified in advance. In this paper, we show that the solution path for ε-SVR is piecewise linear not only in the regularization parameter but also in ε. We further propose an efficient algorithm for exploring the two-dimensional solution space defined by the regularization and error parameters. Unlike the path-following algorithm for SVC, our algorithm for ε-SVR initializes the number of support vectors to zero and increases it gradually as the algorithm proceeds, so a good regression function with the desired sparseness property can be obtained after only a few iterations.
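For concreteness, here is a minimal sketch of the standard ε-SVR problem in its penalized form (the usual ε-insensitive-loss setup; the notation is generic and need not match the paper's). Given training pairs (x_i, y_i), i = 1, ..., n, ε-SVR solves

\[
\min_{f} \;\; \frac{\lambda}{2}\,\lVert f \rVert^{2} \;+\; \sum_{i=1}^{n} \max\bigl( \lvert y_i - f(x_i) \rvert - \varepsilon,\; 0 \bigr),
\]

where λ is the regularization parameter and ε is the half-width of the insensitive tube. Points with |y_i − f(x_i)| ≤ ε incur no loss and are not support vectors, which is why larger ε yields sparser solutions. The piecewise-linearity claim can then be read as: with λ held fixed, the fitted coefficients trace a piecewise-linear trajectory as ε varies, with breakpoints at the values of ε where a point enters or leaves the tube.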
