Nonlinear regularization path for the modified Huber loss Support Vector Machines

Regularization path algorithms have been proposed to deal with the model selection problem in several machine learning approaches. These algorithms compute the entire path of solutions for every value of the regularization parameter, exploiting the fact that the solution paths have a piecewise-linear form. In this paper, we propose a nonlinear regularization path algorithm for the Support Vector Machine (SVM) with a modified Huber loss. We first show that the solution path of the modified Huber loss SVM is a piecewise nonlinear function of the regularization parameter. Since the solutions between two breakpoints are characterized by a rational function, each breakpoint can be identified by solving a rational equation. We then develop an efficient iterative algorithm that solves these rational equations with a quadratic convergence rate. Note that our algorithm is NOT a predictor-corrector type method, which can only follow a nonlinear regularization path with rough approximation. We demonstrate the performance of the algorithm on artificial and real data sets.
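The abstract does not spell out the breakpoint equations, but the claimed quadratic convergence rate is characteristic of a Newton-type iteration. As an illustrative sketch only (not the paper's actual algorithm), the following shows how a root of a rational function r(x) = p(x)/q(x) — the kind of equation that would locate a breakpoint — can be found with Newton's method; the polynomials `p` and `q` here are arbitrary stand-ins:

```python
import numpy as np

def newton_rational_root(p, q, x0, tol=1e-12, max_iter=50):
    """Find a root of the rational function r(x) = p(x)/q(x) by Newton's
    method, which converges quadratically near a simple root.
    `p` and `q` are polynomial coefficient arrays (highest degree first)."""
    dp, dq = np.polyder(p), np.polyder(q)
    x = x0
    for _ in range(max_iter):
        px, qx = np.polyval(p, x), np.polyval(q, x)
        # r'(x) = (p'q - pq') / q^2, so the Newton step
        # x - r(x)/r'(x) simplifies to x - p*q / (p'q - pq').
        denom = np.polyval(dp, x) * qx - px * np.polyval(dq, x)
        x_new = x - px * qx / denom
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy example: r(x) = (x^2 - 2) / (x + 3) vanishes at x = sqrt(2).
root = newton_rational_root([1.0, 0.0, -2.0], [1.0, 3.0], x0=1.0)
```

In a path-following setting such an iteration would be started from the current breakpoint and run until the next breakpoint condition is met; each step costs one evaluation of the rational function and its derivative.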
