Improvements to Bennett's Nearest Point Algorithm for Support Vector Machines

The intuitive geometric interpretation of Support Vector Machines (SVM) provides an alternative way to implement them. Although Bennett's nearest point algorithm (NPA) can handle reduced convex hulls, it has some disadvantages. In this paper, a feasible-direction interpretation of NPA is proposed, which greatly reduces the number of kernel computations. In addition, the original NPA is extended to handle arbitrary valid values of μ, so that better generalization performance may be obtained.
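The geometric interpretation referred to above views hard-margin SVM training as finding the pair of nearest points between the convex hulls of the two classes; the separating hyperplane then bisects the segment joining them. As a rough illustration only (this is not Bennett's NPA, and all names below are invented for the sketch), the nearest-point problem can be reduced to finding the point of the Minkowski difference of the two hulls closest to the origin, solvable by a simple Gilbert-style iteration:

```python
# Illustrative sketch of the nearest-point view of SVM training.
# NOT Bennett's NPA: a plain Gilbert-style iteration that finds the point
# of a convex hull nearest to the origin, applied to the Minkowski
# difference conv(A) - conv(B) of the two classes.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def nearest_point_to_origin(points, iters=200):
    """Iteratively move w inside conv(points) toward the origin."""
    w = list(points[0])
    for _ in range(iters):
        # Support point of the hull in direction -w.
        s = min(points, key=lambda p: dot(w, p))
        d = [si - wi for si, wi in zip(s, w)]
        dd = dot(d, d)
        if dd == 0:
            break
        # Exact line search on the segment [w, s], clamped to [0, 1].
        t = max(0.0, min(1.0, -dot(w, d) / dd))
        if t == 0.0:
            break
        w = [wi + t * di for wi, di in zip(w, d)]
    return w

# Two linearly separable classes in the plane.
A = [(0, 0), (0, 1), (1, 0), (1, 1)]
B = [(3, 0), (3, 1), (4, 0), (4, 1)]
# Nearest points of conv(A) and conv(B) correspond to the point of the
# difference set conv(A) - conv(B) closest to the origin.
diff = [(a[0] - b[0], a[1] - b[1]) for a in A for b in B]
w = nearest_point_to_origin(diff)
gap = dot(w, w) ** 0.5  # distance between the two hulls
```

For the non-separable case the paper's setting replaces the hulls with *reduced* convex hulls (coefficients bounded by μ), which shrinks each hull until the two classes become separable; the sketch above corresponds to the limiting case μ = 1.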
