Efficient Computation and Model Selection for the Support Vector Regression

In this letter, we derive an algorithm that computes the entire solution path of support vector regression (SVR). We also propose an unbiased estimate of the degrees of freedom of the SVR model, which allows convenient selection of the regularization parameter.
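The letter's contributions are an exact solution-path algorithm and an unbiased degrees-of-freedom estimate for SVR. As a rough illustration of how a degrees-of-freedom estimate can drive selection of the regularization parameter, the sketch below sweeps a grid of C values with scikit-learn's SVR and scores each fit with a Cp-style criterion. The grid sweep (in place of an exact path), the plug-in noise-variance estimate, and the use of the unbounded-support-vector count as a degrees-of-freedom proxy are all illustrative assumptions, not the paper's algorithm or its exact estimator.

```python
# Minimal sketch: Cp-style selection of the SVR regularization parameter C.
# Assumptions (not from the letter): the solution path is approximated by a
# finite grid of C values, sigma^2 is estimated from a deliberately flexible
# fit, and degrees of freedom are approximated by the number of unbounded
# support vectors (points on the epsilon-tube boundary).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X).ravel() + 0.3 * rng.standard_normal(n)

epsilon = 0.1
C_grid = np.logspace(-2, 3, 30)  # discretized stand-in for the exact path in C

# Rough noise-variance estimate from the least-regularized fit (assumption).
flexible = SVR(kernel="rbf", C=C_grid[-1], epsilon=epsilon, gamma="scale").fit(X, y)
sigma2 = np.var(y - flexible.predict(X))


def approx_df(model, C):
    """Count unbounded support vectors, i.e. points with
    0 < |alpha_i - alpha_i^*| < C (a degrees-of-freedom proxy, by assumption)."""
    coef = np.abs(model.dual_coef_.ravel())  # alpha_i - alpha_i^* for support vectors
    return int(np.sum(coef < C * (1.0 - 1e-6)))


scores = []
for C in C_grid:
    m = SVR(kernel="rbf", C=C, epsilon=epsilon, gamma="scale").fit(X, y)
    rss = np.sum((y - m.predict(X)) ** 2)
    df = approx_df(m, C)
    cp = rss / n + 2.0 * df * sigma2 / n  # Cp-style model-selection criterion
    scores.append(cp)

best_C = C_grid[int(np.argmin(scores))]
print(f"selected C = {best_C:.3g}")
```

The same criterion could be evaluated along an exact solution path rather than a grid; the grid version is used here only to keep the sketch self-contained with standard library calls.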
