Efficient reduction of support vectors in kernel-based methods

Kernel-based methods such as the support vector machine (SVM) achieve high classification performance. However, classification becomes time-consuming as the number of support vectors (SVs) defining the classifier grows. In this paper, we propose a method that reduces the computational cost of classification with kernel-based methods while retaining their high performance. Using the linear algebra of the kernel Gram matrix of the SVs, the method efficiently prunes, at low computational cost, redundant SVs that are unnecessary for constructing the classifier. The pruning is guided by evaluating the performance of the SVM classifier formed from the reduced SV set. Experiments on SVM classification over various datasets demonstrate both the validity of the evaluation criterion and the effectiveness of the proposed method.
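The core idea in the abstract, pruning SVs that are linearly dependent in feature space by analyzing the kernel Gram matrix, can be sketched in code. The following is a minimal illustration, not the paper's actual algorithm: it assumes an RBF kernel, uses a pivoted-QR rank test on the Gram matrix, and the names `rbf_kernel`, `reduce_svs`, and the parameters `gamma` and `tol` are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.linalg import qr

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared distances, then the RBF map k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def reduce_svs(X_sv, alpha, gamma=1.0, tol=1e-8):
    """Prune SVs whose feature-space images are (numerically) linear
    combinations of the others, folding their dual coefficients into
    the retained SVs so the decision function is preserved."""
    K = rbf_kernel(X_sv, X_sv, gamma)          # Gram matrix of the SVs
    # Pivoted QR reveals a maximal linearly independent set of columns of K,
    # which corresponds to an independent subset of feature-space images.
    _, R, piv = qr(K, pivoting=True)
    rank = int(np.sum(np.abs(np.diag(R)) > tol * np.abs(R[0, 0])))
    keep = np.sort(piv[:rank])                 # retained SVs
    drop = np.sort(piv[rank:])                 # redundant SVs
    # Express each dropped image in the retained basis:
    #   K[keep, keep] @ C = K[keep, drop]
    C = np.linalg.solve(K[np.ix_(keep, keep)], K[np.ix_(keep, drop)])
    # Fold the dropped coefficients into the retained ones.
    beta = alpha[keep] + C @ alpha[drop]
    return X_sv[keep], beta
```

Because the Gram matrix has the same rank as the set of feature-space images, the columns selected by pivoted QR identify an independent SV subset, and solving the linear system above redistributes the dropped dual coefficients so that the decision function is unchanged up to the numerical tolerance.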
