Working Set Selection Using Second Order Information for Training Support Vector Machines

Working set selection is an important step in decomposition methods for training support vector machines (SVMs). This paper develops a new technique for working set selection in SMO-type decomposition methods. It uses second order information to achieve fast convergence. Theoretical properties such as linear convergence are established. Experiments demonstrate that the proposed method is faster than existing selection methods using first order information.
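
To make the idea concrete, below is a minimal Python sketch of a second-order working set selection rule of the kind the paper proposes for SMO-type decomposition: the first index is the maximal violator (as in first-order selection), and the second index is chosen to maximize an estimate of the decrease of the dual objective, b^2 / a, where b is the pair's first-order violation and a its curvature. This is a sketch under stated assumptions, not the authors' reference implementation; the function and variable names (select_working_set, alpha, grad, Q, eps, tau) are illustrative, and Q is assumed to be the label-folded kernel matrix Q_ij = y_i y_j K(x_i, x_j).

import numpy as np

def select_working_set(alpha, grad, y, Q, C, eps=1e-3, tau=1e-12):
    """Pick a pair (i, j) for one SMO-type iteration, or (-1, -1) at (approximate) optimality."""
    l = len(alpha)
    # Variables that can still move "up" / "down" without leaving the box [0, C].
    I_up = [t for t in range(l) if (y[t] == 1 and alpha[t] < C) or (y[t] == -1 and alpha[t] > 0)]
    I_low = [t for t in range(l) if (y[t] == 1 and alpha[t] > 0) or (y[t] == -1 and alpha[t] < C)]
    if not I_up or not I_low:
        return -1, -1

    # First index: maximal violator, exactly as in first-order selection.
    i = max(I_up, key=lambda t: -y[t] * grad[t])
    G_max = -y[i] * grad[i]
    G_min = min(-y[t] * grad[t] for t in I_low)

    # Second index: among candidates that violate optimality together with i,
    # pick the one maximizing the second-order gain b^2 / a.
    j, best_gain = -1, -np.inf
    for t in I_low:
        b = G_max + y[t] * grad[t]              # first-order violation of the pair (i, t)
        if b <= 0:
            continue
        # Curvature along the (i, t) direction; with Q[i, j] = y_i * y_j * K(x_i, x_j),
        # Q[i, i] + Q[t, t] - 2 * y[i] * y[t] * Q[i, t] equals K_ii + K_tt - 2 K_it.
        a = Q[i, i] + Q[t, t] - 2.0 * y[i] * y[t] * Q[i, t]
        if a <= 0:
            a = tau                             # guard against non-positive curvature
        gain = b * b / a
        if gain > best_gain:
            best_gain, j = gain, t

    # Stop when no sufficiently violating pair remains.
    if j == -1 or G_max - G_min < eps:
        return -1, -1
    return i, j

Compared with first-order (maximal violating pair) selection, the extra work per iteration is essentially one kernel row for index i, which an SMO-type solver typically needs anyway to update the gradient after the selected pair is optimized.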
