A Note on the Decomposition Methods for Support Vector Regression

The dual formulation of support vector regression involves two closely related sets of variables. When a decomposition method is applied, many existing approaches use pairs of indices from these two sets as the working set: they first select a base set of indices and then expand it so that every selected index appears as a pair. This makes the implementation different from that for support vector classification, and a larger optimization subproblem must be solved in each iteration. We provide theoretical proofs and conduct experiments showing that using the base set itself as the working set leads to similar convergence behavior (a similar number of iterations). Therefore, with a smaller working set and essentially the same number of iterations, the resulting program is simpler and more efficient.
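For concreteness, the dual problem in question is the standard epsilon-insensitive SVR dual; the notation below (alpha, alpha*, C, epsilon, kernel matrix Q, and the sign convention on the linear term) is the conventional one and is an assumption of this sketch, not notation fixed by the abstract:

\min_{\alpha,\alpha^{*}} \quad \frac{1}{2}(\alpha-\alpha^{*})^{\top} Q\, (\alpha-\alpha^{*}) + \varepsilon \sum_{i=1}^{\ell} (\alpha_i + \alpha_i^{*}) - \sum_{i=1}^{\ell} y_i\, (\alpha_i - \alpha_i^{*})

\text{subject to} \quad \sum_{i=1}^{\ell} (\alpha_i - \alpha_i^{*}) = 0, \qquad 0 \le \alpha_i,\ \alpha_i^{*} \le C, \quad i = 1, \dots, \ell,

where Q_{ij} = K(x_i, x_j) is the kernel matrix. The two closely related sets of variables are \alpha = (\alpha_1, \dots, \alpha_\ell) and \alpha^{*} = (\alpha_1^{*}, \dots, \alpha_\ell^{*}). Pair-based decomposition methods place both \alpha_i and \alpha_i^{*} in the working set whenever index i is chosen, doubling the subproblem size; the base-set approach studied here uses the selected indices directly, without forcing the paired variable into the working set.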
