A Note on the Decomposition Methods for Support Vector Regression

The dual formulation of support vector regression involves two closely related sets of variables. When a decomposition method is applied, many existing approaches take pairs of indices drawn from these two sets as the working set: they first select a base set and then expand it so that every index appears together with its counterpart. This makes the implementation differ from that for support vector classification, and a larger optimization subproblem must be solved in each iteration. We provide theoretical proofs and experimental evidence that using the base set itself as the working set leads to similar convergence (a similar number of iterations). Hence, with a smaller working set and roughly the same number of iterations, the implementation can be both simpler and more efficient.
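
For orientation, the two sets of variables are the standard dual multipliers of the ε-insensitive loss. A sketch of the usual dual, restated here for context (the notation Q, C, ε, l is the conventional one and is our addition, not a quotation from this note):

\min_{\alpha,\,\alpha^*} \ \tfrac{1}{2}(\alpha - \alpha^*)^T Q\, (\alpha - \alpha^*) + \varepsilon \sum_{i=1}^{l} (\alpha_i + \alpha_i^*) - \sum_{i=1}^{l} y_i (\alpha_i - \alpha_i^*)

\text{subject to}\quad \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) = 0, \qquad 0 \le \alpha_i,\ \alpha_i^* \le C,\quad i = 1, \dots, l,

where Q_{ij} = K(x_i, x_j). Each training point x_i contributes the pair (α_i, α_i*), which is why many decomposition implementations have treated the two indices as inseparable when forming the working set.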
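The following is a minimal sketch, under the assumptions above, of the two working-set strategies the abstract contrasts. The 2l dual variables are stacked as beta = (α, α*), and the helper names (select_base_set, expand_to_pairs) are hypothetical, not taken from the paper; the selection rule shown is a generic maximal-violating-pair rule, not necessarily the note's exact one.

```python
import numpy as np

def select_base_set(grad, beta, t, C):
    """Pick a maximal violating pair as the base working set.

    grad : gradient of the dual objective at beta (length 2l)
    beta : stacked dual variables (alpha, alpha*), length 2l
    t    : equality-constraint coefficients (+1,...,+1,-1,...,-1)
    C    : box-constraint upper bound
    """
    score = -t * grad                                      # feasible-descent proxy
    up = ((beta < C) & (t > 0)) | ((beta > 0) & (t < 0))   # may move "up"
    low = ((beta < C) & (t < 0)) | ((beta > 0) & (t > 0))  # may move "down"
    i = np.where(up)[0][np.argmax(score[up])]
    j = np.where(low)[0][np.argmin(score[low])]
    return [i, j]

def expand_to_pairs(base, l):
    """Expand a base set so alpha_i and alpha_i^* always enter the
    subproblem together (the paired strategy the note argues against).
    Variables 0..l-1 are alpha, l..2l-1 are alpha*."""
    paired = set()
    for k in base:
        i = k % l
        paired.update((i, i + l))   # add both members of the pair
    return sorted(paired)

# Example: with l = 3 training points, the base set {1, 5} expands to
# {1, 2, 4, 5}, doubling the subproblem size.
print(expand_to_pairs([1, 5], l=3))   # -> [1, 2, 4, 5]
```

The paper's point is that the expansion step can be skipped: solving the subproblem over the base set alone converges in a comparable number of iterations, while each iteration solves a subproblem only half as large.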
