Decremental multi-output least square SVR learning

The solution of multi-output LS-SVR machines follows from solving a set of linear equations. Compared with ε-insensitive SVR, however, it loses the advantage of a sparse solution. To limit the number of support vectors and reduce the computational cost, this paper presents a decremental recursive algorithm for multi-output LS-SVR machines. The algorithm removes one sample at a time and updates the large-scale matrix inverse quickly from the previous result instead of recomputing it. The decremental algorithm can be used to train multi-output LS-SVR machines online. Experimental results demonstrate the effectiveness of the algorithm.
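The core of such a decremental update is the block matrix inversion identity: if the inverse of the full system matrix is already known, the inverse after deleting one sample's row and column can be obtained without re-inverting. Below is a minimal sketch of this kind of recursive downdate, assuming a symmetric system matrix as in LS-SVR; the function name `downdate_inverse` and the permutation scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def downdate_inverse(K_inv, i):
    """Sketch: remove sample i from an n x n inverse without re-inverting.

    If, after moving row/column i to the last position, the known inverse
    is partitioned as [[E, f], [f.T, g]], then the inverse of the matrix
    with that row/column deleted is E - f f^T / g.
    (Illustrative assumption; not the paper's exact update formula.)
    """
    n = K_inv.shape[0]
    # Permute the known inverse so the deleted sample is last.
    idx = [j for j in range(n) if j != i] + [i]
    P = K_inv[np.ix_(idx, idx)]
    E = P[:-1, :-1]   # inverse block for the remaining samples
    f = P[:-1, -1]    # cross term with the removed sample
    g = P[-1, -1]     # scalar block of the removed sample
    return E - np.outer(f, f) / g
```

This replaces an O(n^3) re-inversion with an O(n^2) rank-one correction per removed sample, which is what makes repeated pruning of support vectors affordable for online training.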

[1]  JianghuaLiu,et al.  Online LS-SVM for function estimation and classification , 2003 .

[2]  Johan A. K. Suykens,et al.  Least Squares Support Vector Machine Classifiers , 1999, Neural Processing Letters.

[3]  Gert Cauwenberghs,et al.  Incremental and Decremental Support Vector Machine Learning , 2000, NIPS.

[4]  Zhang Hao,et al.  Incremental and Online Learning Algorithm for Regression Least Squares Support Vector Machine , 2006 .

[5]  José M. Matías Multi-output Nonparametric Regression , 2005, EPIA.

[6]  Johan A. K. Suykens,et al.  Sparse approximation using least squares support vector machines , 2000, 2000 IEEE International Symposium on Circuits and Systems. Emerging Technologies for the 21st Century. Proceedings (IEEE Cat No.00CH36353).