A novel smoothing 1-norm SVM for classification and regression

The standard 2-norm support vector machine (SVM) is known for its good performance in classification and regression problems. In this paper, the 1-norm support vector machine is considered, and a novel smoothing function method for support vector classification (SVC) and regression (SVR) is proposed in an attempt to overcome drawbacks of earlier methods, which are complex, subtle, and sometimes difficult to implement. First, using the Karush-Kuhn-Tucker complementarity conditions from optimization theory, an unconstrained, non-differentiable optimization model is built. Then a smooth approximation algorithm based on a differentiable function is given. Finally, the data sets are trained with a standard unconstrained optimization method. The algorithm is fast and insensitive to the initial point. Theoretical analysis and numerical results illustrate that the smoothing function method for SVMs is feasible and effective.
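The abstract only outlines the construction; the paper itself supplies the exact model and smoothing function. As a rough illustration of the general idea, the sketch below smooths both non-differentiable pieces of a linear 1-norm SVC objective, approximating |t| by sqrt(t^2 + eps) and the hinge loss via the plus-function smoothing t + (1/alpha) log(1 + e^(-alpha t)) familiar from the smooth-SVM literature, then hands the result to a standard unconstrained optimizer. The particular smoothing choices and the names alpha, eps, C, and fit_smooth_l1_svc are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch of a smoothed 1-norm SVC (assumed formulation, not the
# paper's exact method): smooth the 1-norm penalty and the hinge loss,
# then minimize with an off-the-shelf unconstrained optimizer.
import numpy as np
from scipy.optimize import minimize

def smooth_plus(t, alpha=5.0):
    # Smooth approximation of the plus function max(t, 0);
    # tightens as alpha grows. np.logaddexp keeps it overflow-safe.
    return t + np.logaddexp(0.0, -alpha * t) / alpha

def smooth_abs(t, eps=1e-6):
    # Smooth approximation of |t|, used for the 1-norm penalty.
    return np.sqrt(t * t + eps)

def objective(theta, X, y, C):
    # theta packs (w, b); objective = smoothed ||w||_1
    # + C * sum of smoothed hinge losses.
    w, b = theta[:-1], theta[-1]
    margins = 1.0 - y * (X @ w + b)
    return smooth_abs(w).sum() + C * smooth_plus(margins).sum()

def fit_smooth_l1_svc(X, y, C=1.0):
    # Standard unconstrained method (BFGS) on the now-smooth objective,
    # started from the origin; the smoothed problem is differentiable.
    theta0 = np.zeros(X.shape[1] + 1)
    res = minimize(objective, theta0, args=(X, y, C), method="BFGS")
    return res.x[:-1], res.x[-1]

if __name__ == "__main__":
    # Toy linearly separable-ish data with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
    w, b = fit_smooth_l1_svc(X, y, C=1.0)
    print("train accuracy:", np.mean(np.sign(X @ w + b) == y))
    print("w:", np.round(w, 3))
```

Because the smoothed objective is differentiable everywhere, any standard unconstrained solver applies, which is the practical payoff the abstract claims; the 1-norm penalty still drives small weights toward zero, giving the feature-selection behavior associated with 1-norm SVMs.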
