An Incremental Learning Strategy for Support Vector Regression

The support vector machine (SVM) offers good generalization performance but is computationally expensive to train. This paper presents an incremental learning strategy for support vector regression (SVR). The method first derives an explicit expression for ||W||^2 by constructing an orthogonal basis in feature space and applying a basic Hilbert-space identity, and then obtains the regression function by minimizing this expression directly rather than by solving a convex programming problem. In particular, the minimization of ||W||^2 is combined with kernel selection, which can lead to better generalization performance. The presented method not only provides a novel route to incremental SVR learning, but also opens an opportunity for SVR model selection. An artificial data set, a benchmark data set, and a real-world data set are used to evaluate the method, and the simulations support the feasibility and effectiveness of the proposed approach.
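
The central computation described above is the explicit evaluation of ||W||^2 through an orthogonal basis of the feature space. The sketch below is a minimal illustration of that general idea, not the paper's algorithm: it incrementally orthogonalizes the kernel-induced feature vectors (a Gram-Schmidt / incremental Cholesky scheme), evaluates ||w||^2 = alpha^T K alpha in the resulting coordinates, and fits a simple ridge-style regressor as a stand-in for the ||w||^2 minimization. The Gaussian kernel, the parameters gamma, tol, and lam, and all class and function names are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

class IncrementalFeatureBasis:
    """Incrementally builds an orthonormal basis of span{phi(x_1), ..., phi(x_n)}
    in the kernel-induced feature space. R[i, :] holds the coordinates of phi(x_i)
    in that basis, so that K = R R^T (up to the tolerance used below)."""

    def __init__(self, gamma=1.0, tol=1e-10):
        self.gamma = gamma
        self.tol = tol
        self.X = None   # samples seen so far, shape (n, d)
        self.R = None   # coordinate matrix, shape (n, m) with m <= n

    def add(self, x):
        x = np.atleast_2d(np.asarray(x, dtype=float))
        k_self = gaussian_kernel(x, x, self.gamma)[0, 0]
        if self.X is None:
            self.X = x
            self.R = np.array([[np.sqrt(k_self)]])
            return
        # Inner products of the new feature vector with the stored ones.
        k_new = gaussian_kernel(self.X, x, self.gamma)[:, 0]
        # Coordinates of phi(x) on the current basis: least-squares solve of R r ~= k_new.
        r = np.linalg.lstsq(self.R, k_new, rcond=None)[0]
        residual2 = max(k_self - r @ r, 0.0)
        if residual2 > self.tol:
            # phi(x) contributes a genuinely new direction: extend the basis by one vector.
            self.R = np.hstack([self.R, np.zeros((self.R.shape[0], 1))])
            new_row = np.append(r, np.sqrt(residual2))
        else:
            new_row = r
        self.X = np.vstack([self.X, x])
        self.R = np.vstack([self.R, new_row])

    def norm_w_squared(self, alpha):
        """||w||^2 for w = sum_i alpha_i phi(x_i), i.e. alpha^T K alpha, via the coordinates."""
        return float(np.linalg.norm(alpha @ self.R) ** 2)

# Toy usage: stream a few 1-D samples, then fit a ridge-style regressor in the explicit
# coordinate representation as a simple stand-in for trading off ||w||^2 against data fit.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(40, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(40)

    basis = IncrementalFeatureBasis(gamma=0.5)
    for x in X:
        basis.add(x)

    lam = 1e-3                              # illustrative regularization strength
    R = basis.R                             # explicit feature coordinates, shape (n, m)
    theta = np.linalg.solve(R.T @ R + lam * np.eye(R.shape[1]), R.T @ y)
    print("||w||^2 =", float(theta @ theta))
    print("training RMSE =", float(np.sqrt(np.mean((R @ theta - y) ** 2))))
```

Because the coordinates in R are with respect to an orthonormal basis, ||w||^2 reduces to an ordinary Euclidean norm, which is what makes a direct, incremental minimization possible without a quadratic programming solver.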
