A New Solution Path Algorithm in Support Vector Regression

Regularization path algorithms were proposed as a novel approach to the model selection problem: they efficiently explore the path of (possibly all) solutions with respect to some regularization hyperparameter. This approach was later extended to a support vector regression (SVR) model called ε-SVR. However, that method requires the error parameter ε to be set a priori, which is possible only if the desired accuracy of the approximation can be specified in advance. In this paper, we analyze the solution space of ε-SVR and propose a new solution path algorithm, called the ε-path algorithm, which traces the solution path with respect to the hyperparameter ε rather than λ. Although both solution path algorithms possess the desirable piecewise-linearity property, our ε-path algorithm overcomes some limitations of the original λ-path algorithm and offers additional advantages, making it more appealing for practical use.
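
The sketch below is not the ε-path algorithm itself, only a brute-force illustration of the quantity it traces: using scikit-learn's SVR (an implementation choice assumed here, not prescribed by the paper), we re-fit the model on a grid of ε values and watch the support set shrink as the ε-insensitive tube widens. The actual ε-path algorithm instead exploits piecewise linearity to follow the solution continuously between the breakpoints where the support set changes, avoiding a full re-solve at every grid point.

```python
# Hedged sketch: grid-based approximation of the epsilon-SVR solution path.
# This is NOT the epsilon-path algorithm from the paper; it only illustrates
# how the fitted solution (here, the support-vector set) evolves with epsilon.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)

C = 10.0  # fixed regularization level (the lambda-path would vary this instead)
for eps in np.linspace(0.0, 0.5, 6):
    model = SVR(kernel="rbf", gamma=1.0, C=C, epsilon=eps).fit(X, y)
    # Points strictly inside the epsilon-tube drop out of the support set,
    # so the number of support vectors decreases as epsilon grows.
    print(f"epsilon = {eps:.2f}: {model.support_.size} support vectors")
```

Between two consecutive values of ε at which the support set is unchanged, the dual coefficients move linearly in ε, which is exactly the structure a path algorithm exploits to jump directly from breakpoint to breakpoint.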
