Regularization Path for $\nu$-Support Vector Classification

The ν-support vector classification (ν-SVC) proposed by Schölkopf et al. has the advantage that its regularization parameter ν directly controls the number of support vectors and margin errors. Compared to C-SVC, however, its formulation is more complicated, and to date there has been no effective method for computing its regularization path. In this paper, we propose a new regularization path algorithm that is based on a modified formulation of ν-SVC and traces the solution path with respect to the parameter ν. Through theoretical analysis and confirmatory experiments, we show that, under two assumptions (Assumptions 1 and 2), our algorithm avoids infeasible updating paths and fits the entire solution path in a finite number of steps. Once the regularization path of ν-SVC is available, the approach proposed by Yang and Ong can be applied to obtain the globally optimal solution of common validation functions for ν-SVC, with little additional computation. Numerical experiments show that our method is more efficient than various grid-search strategies for selecting the optimal regularization parameter ν.
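For context, a standard form of the ν-SVC primal problem (the classical formulation from Schölkopf et al. [4], not the modified formulation used in this paper) is

\[
\min_{w,\,b,\,\xi,\,\rho} \;\; \frac{1}{2}\|w\|^2 - \nu\rho + \frac{1}{\ell}\sum_{i=1}^{\ell}\xi_i
\quad \text{s.t.} \quad y_i\big(\langle w, \phi(x_i)\rangle + b\big) \ge \rho - \xi_i,\;\; \xi_i \ge 0,\;\; \rho \ge 0,
\]

where ν is an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors, which is the control property the abstract refers to.

The grid-search baseline that the path algorithm is compared against can be illustrated with the following minimal sketch, written here using scikit-learn's NuSVC solver and synthetic data as assumptions (the paper's experiments may use different solvers and benchmarks). It sweeps a grid of ν values and keeps the one with the best cross-validated accuracy, retraining from scratch at every grid point, which is exactly the repeated cost a regularization path algorithm avoids.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import NuSVC

# Synthetic data standing in for a benchmark set; purely illustrative.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

best_nu, best_score = None, -np.inf
for nu in np.linspace(0.05, 0.95, 19):  # candidate grid for the parameter nu
    try:
        # 5-fold cross-validated accuracy of nu-SVC at this value of nu;
        # each grid point requires solving the underlying QP from scratch.
        score = cross_val_score(NuSVC(nu=nu, kernel="rbf", gamma="scale"),
                                X, y, cv=5).mean()
    except ValueError:
        # nu outside the feasible range for this data / class balance
        continue
    if score > best_score:
        best_nu, best_score = nu, score

print(f"best nu = {best_nu:.2f}, CV accuracy = {best_score:.3f}")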

[1] Alston S. Householder, et al. The Theory of Matrices in Numerical Analysis, 1964.

[2] Ron Kohavi, et al. A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection, 1995, IJCAI.

[3] Vladimir Vapnik, et al. Statistical Learning Theory, 1998.

[4] Bernhard Schölkopf, et al. New Support Vector Algorithms, 2000, Neural Computation.

[5] Chih-Jen Lin, et al. On the convergence of the decomposition method for support vector machines, 2001, IEEE Trans. Neural Networks.

[6] Chih-Jen Lin, et al. Training ν-Support Vector Classifiers: Theory and Algorithms, 2001.

[7] Anthony Widjaja, et al. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, 2003, IEEE Transactions on Neural Networks.

[8] Ingo Steinwart, et al. On the Optimal Parameter Choice for ν-Support Vector Machines, 2003, IEEE Trans. Pattern Anal. Mach. Intell.

[9] Robert Tibshirani, et al. The Entire Regularization Path for the Support Vector Machine, 2004, J. Mach. Learn. Res.

[10] S. Sathiya Keerthi, et al. An Efficient Method for Gradient-Based Adaptation of Hyperparameters in SVM Models, 2006, NIPS.

[11] Eric Horvitz, et al. Considering Cost Asymmetry in Learning Classifiers, 2006, J. Mach. Learn. Res.

[12] Stephen P. Boyd, et al. Convex Optimization, 2004, Algorithms and Theory of Computation Handbook.

[13] Klaus-Robert Müller, et al. Incremental Support Vector Learning: Analysis, Implementation and Applications, 2006, J. Mach. Learn. Res.

[14] Ji Zhu, et al. Efficient Computation and Model Selection for the Support Vector Regression, 2007, Neural Computation.

[15] Gang Wang, et al. A kernel path algorithm for support vector machines, 2007, ICML '07.

[16] Wei Chu, et al. Support Vector Ordinal Regression, 2007, Neural Computation.

[17] Gang Wang, et al. A New Solution Path Algorithm in Support Vector Regression, 2008, IEEE Transactions on Neural Networks.

[18] Bin Gu, et al. On-line off-line Ranking Support Vector Machine and analysis, 2008, IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence).

[19] Saharon Rosset. Bi-level path following for cross validated solution of kernel quantile regression, 2008, ICML '08.

[20] Takafumi Kanamori, et al. Nonparametric Conditional Density Estimation Using Piecewise-Linear Solution Path of Kernel Quantile Regression, 2009, Neural Computation.

[21] Deyu Meng, et al. Fast and Efficient Strategies for Model Selection of Gaussian Support Vector Machine, 2009, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics).

[22] Takafumi Kanamori, et al. A Least-squares Approach to Direct Importance Estimation, 2009, J. Mach. Learn. Res.

[23] Jiandong Wang, et al. An Effective Regularization Path for ν-Support Vector Classification, 2009, Third International Symposium on Intelligent Information Technology Application.

[24] Richard G. Baraniuk, et al. Tuning Support Vector Machines for Minimax and Neyman-Pearson Classification, 2008, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[25] Jianbo Yang, et al. An Improved Algorithm for the Solution of the Regularization Path of Support Vector Machine, 2010, IEEE Transactions on Neural Networks.

[26] Davide Anguita, et al. Using Unsupervised Analysis to Constrain Generalization Bounds for Support Vector Classifiers, 2010, IEEE Transactions on Neural Networks.

[27] Chih-Jen Lin, et al. LIBSVM: A library for support vector machines, 2011, TIST.

[28] R. Tibshirani, et al. The solution path of the generalized lasso, 2010, arXiv:1005.1971.

[29] Jian-Bo Yang, et al. Determination of Global Minima of Some Common Validation Functions in Support Vector Machine, 2011, IEEE Transactions on Neural Networks.