Householder transformation based sparse least squares support vector regression

Sparseness is a key issue in kernel-based modeling. To sparsify the solution of the standard least squares support vector regression (LSSVR), this paper proposes a novel sparse method, termed HSLSSVR, which recruits support vectors sequentially by means of Householder transformations. HSLSSVR offers two benefits. First, a recursive strategy is adopted to update the solution of the linear system instead of solving it from scratch; during each iteration, the training sample that yields the maximum reduction of the residual is recruited as a support vector. Second, the condition number of the linear system does not deteriorate during the solution process, so numerical stability is guaranteed. Experiments on benchmark data sets and on a real-world mechanical task, computing the inverse dynamics of a robot arm, demonstrate the effectiveness and feasibility of the proposed HSLSSVR.
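As a rough illustration of the sequential recruitment idea, the following minimal Python sketch performs greedy forward selection of support vectors for an LSSVR-style model: at each step the candidate column of the regularized kernel matrix that gives the largest reduction of the squared residual is recruited. This is an assumption-laden simplification for exposition only; it re-solves each least squares subproblem with NumPy's QR-based solver (which internally uses Householder reflections) rather than the paper's recursive Householder update, and the kernel width, regularization parameter, and function names are all hypothetical.

```python
import numpy as np


def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def sparse_lssvr_greedy(X, y, n_sv=15, sigma=1.0, gamma=10.0):
    """Greedy forward selection of support vectors (simplified sketch).

    At each iteration, the unselected column of the regularized kernel
    matrix whose inclusion most reduces the squared residual is recruited.
    The subproblems are solved with np.linalg.lstsq instead of the
    recursive Householder updates described in the paper.
    """
    n = X.shape[0]
    A = rbf_kernel(X, X, sigma) + np.eye(n) / gamma  # regularized kernel matrix
    selected = []
    residual = y.astype(float).copy()
    for _ in range(n_sv):
        best_j, best_drop, best_res = None, -np.inf, None
        for j in range(n):
            if j in selected:
                continue
            cols = A[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            res = y - cols @ coef
            drop = residual @ residual - res @ res  # reduction in squared residual
            if drop > best_drop:
                best_j, best_drop, best_res = j, drop, res
        selected.append(best_j)
        residual = best_res
    alpha, *_ = np.linalg.lstsq(A[:, selected], y, rcond=None)
    return selected, alpha, residual


if __name__ == "__main__":
    # Toy regression problem: noisy sinc function (hypothetical example data).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
    sv_idx, alpha, residual = sparse_lssvr_greedy(X, y, n_sv=15)
    print("support vectors:", len(sv_idx),
          "training RMSE:", np.sqrt(np.mean(residual ** 2)))
```

The sketch has O(n) candidate evaluations per recruited support vector and re-factorizes the selected columns each time; the recursive Householder scheme described in the abstract avoids that re-factorization while keeping the triangular system well conditioned.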
