A regression approach to LS-SVM and sparse realization based on fast subset selection

The Least Squares Support Vector Machine (LS-SVM) is a modified SVM with a ridge regression cost function and equality constraints. It has been successfully applied to many classification problems. However, a common issue with LS-SVM is its lack of sparseness, which is a serious drawback in applications. To tackle this problem, a fast approach is proposed in this paper for developing a sparse LS-SVM. First, a new regression solution is proposed for the LS-SVM that optimizes the same objective function as the conventional solution. Based on this, a new subset selection method is then adopted to realize the sparse approximation. Simulation results on different benchmark datasets, i.e., Checkerboard and two Gaussian datasets, show that the proposed solution achieves a better objective value than the conventional LS-SVM, and that the proposed approach yields a sparser LS-SVM than the conventional one while providing comparable predictive classification accuracy. Additionally, the computational complexity is significantly decreased.
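To make the starting point concrete, a minimal sketch of the conventional (non-sparse) LS-SVM classifier that the paper builds on: training reduces to solving one linear KKT system, and every training point receives a nonzero support value, which is exactly the lack of sparseness discussed above. The RBF kernel, toy data, and parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-sq / (2 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Conventional LS-SVM: solve the KKT linear system
    #   [ 0        y^T          ] [b]     [0]
    #   [ y   Omega + I / gamma ] [alpha] [1]
    # with Omega[i, j] = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha (all nonzero in general)

def lssvm_predict(X, y, alpha, b, Xtest, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b )
    K = rbf_kernel(Xtest, X, sigma)
    return np.sign(K @ (alpha * y) + b)

# Toy two-class problem (illustrative, well separated)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [3., 3.], [3., 4.], [4., 3.]])
y = np.array([-1., -1., -1., 1., 1., 1.])
b, alpha = lssvm_train(X, y)
pred = lssvm_predict(X, y, alpha, b, X)
```

Because the equality constraints turn every training point into a support vector, a sparse variant must prune or select a subset of the `alpha` values; the fast subset selection proposed in the paper addresses exactly this step.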
