A novel and quick SVM-based multi-class classifier

Different positive real numbers p_i are used to represent the pattern classes. After the input patterns are mapped into a feature space by a nonlinear mapping, a linear relation between the mapped patterns and the numbers p_i is assumed, with undetermined coefficients and bias; the hyperplane corresponding to zero output of this linear relation is taken as the base hyperplane. To determine the unknown parameters, an objective function is constructed that minimizes the difference between the outputs of patterns belonging to the same class and the corresponding p_i, while maximizing the distance between any two hyperplanes corresponding to different classes. This objective function has the same form as that of support vector regression, so the coefficients and bias of the linear relation can be computed by known methods such as the SVMlight approach. Three methods are also given for determining the p_i; the best is to determine them during training, which yields relatively high accuracy. Experimental results on the IRIS data set show that the accuracy of this method exceeds that of many SVM-based multi-class classifiers and approaches that of DAGSVM (decision-directed acyclic graph SVM); notably, its recognition speed is the highest.
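The core idea above, regressing each class onto its own real target p_i and classifying a new pattern by the nearest target, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses kernel ridge regression as a stand-in for the SVR-style solver the paper employs, and the targets, RBF kernel, and regularization constant are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gaussian (RBF) kernel matrix between row-vector sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit(X, y, targets, gamma=1.0, lam=1e-3):
    """Fit a kernelized regressor that maps patterns of class k to targets[k].

    Kernel ridge regression stands in for the paper's SVR solver; the
    per-class numbers targets[k] play the role of the p_i in the method.
    """
    t = np.array([targets[label] for label in y])          # per-sample target p_i
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), t)   # dual coefficients
    return X, alpha

def predict(model, targets, Xnew, gamma=1.0):
    Xtr, alpha = model
    out = rbf_kernel(Xnew, Xtr, gamma) @ alpha             # real-valued outputs
    t = np.asarray(targets)
    # Assign each sample to the class whose target p_k is nearest its output.
    return np.abs(out[:, None] - t[None, :]).argmin(axis=1)

# Toy two-class example with hand-picked targets p_0 = 1.0, p_1 = 2.0.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.0, 5.1]])
y = [0, 0, 1, 1]
targets = [1.0, 2.0]
model = fit(X, np.array(y), targets)
pred = predict(model, targets, X)
```

Note that recognition requires only one regression evaluation followed by a nearest-target lookup, which is why the method's classification speed compares favorably with one-vs-one or DAG schemes that evaluate many binary SVMs.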
