A new Support Vector classification algorithm with parametric-margin model

In this paper, a new algorithm for Support Vector classification is described. It is shown how to use a parametric-margin model with non-constant radius. This is useful in many cases, especially when the noise is heteroscedastic, that is, when it depends on x. Moreover, for an a priori chosen ν, the proposed SV classification algorithm has the advantage of using the parameter 0 ≤ ν ≤ 1 to control the number of support vectors. More precisely, ν is an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors; hence the selection of ν is more intuitive. The algorithm is analyzed theoretically and experimentally.
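The ν-property described above can be illustrated empirically. The sketch below uses scikit-learn's `NuSVC`, which implements the standard ν-SVC of Schölkopf et al. rather than the parametric-margin variant proposed in this paper, so it is only a minimal illustration of how ν lower-bounds the fraction of support vectors; the synthetic data and parameter choices are assumptions.

```python
# Minimal sketch (assumption: scikit-learn's standard nu-SVC, not the
# parametric-margin variant of this paper) showing that the fraction of
# support vectors is bounded below by nu.
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
# Two overlapping Gaussian classes, so some margin errors are unavoidable.
X = np.vstack([rng.normal(-1.0, 1.0, size=(100, 2)),
               rng.normal(+1.0, 1.0, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

for nu in (0.1, 0.3, 0.6):
    clf = NuSVC(nu=nu, kernel="rbf", gamma="scale").fit(X, y)
    frac_sv = clf.support_.size / len(y)
    # At the optimum, nu is a lower bound on the support-vector fraction.
    print(f"nu={nu:.1f}  fraction of support vectors={frac_sv:.2f}")
```

Raising ν forces more training points to become support vectors (and permits more margin errors), which is what makes ν a more intuitive parameter to select than the penalty constant C.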
