An incremental learning algorithm for Lagrangian support vector machines

Incremental learning has recently attracted increasing attention, in both theory and application. In this paper, incremental learning algorithms for the Lagrangian support vector machine (LSVM) are proposed. LSVM is an improvement on the standard linear SVM for classification that reduces training to the minimization of an unconstrained differentiable convex program. The solution to this program is obtained by a simple iterative scheme with linear convergence. Using the Sherman-Morrison-Woodbury identity, the matrix inversion required by the algorithm is reduced, at the outset, to the inversion of a matrix whose order is the dimensionality of the original input space plus one, which substantially cuts computation time. The incremental learning algorithms presented here cover two cases: online and batch incremental learning. Because the matrix inverse after an increment is obtained from previously computed information, the computation need not be repeated from scratch. Experimental results show that the proposed algorithms outperform existing methods.
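The two ideas in the abstract, the linearly convergent LSVM iteration with its Sherman-Morrison-Woodbury (SMW) inversion, and the SMW-based incremental update that reuses the previous inverse, can be sketched as follows. This is a minimal illustration under standard assumptions about the LSVM formulation (Q = I/ν + HHᵀ with H = D[A, −e], and step size 0 < α < 2/ν); all function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def lsvm_train(A, d, nu=1.0, alpha=None, tol=1e-5, max_iter=200):
    """Sketch of the LSVM iteration for a linear classifier sign(x @ w - gamma).
    A: (m, n) data matrix, d: (m,) labels in {-1, +1}."""
    m, n = A.shape
    e = np.ones(m)
    # H = D [A, -e]; the (implicit) m x m matrix is Q = I/nu + H H^T.
    H = d[:, None] * np.hstack([A, -e[:, None]])
    if alpha is None:
        alpha = 1.9 / nu                      # linear convergence needs 0 < alpha < 2/nu
    # SMW: (I/nu + H H^T)^{-1} = nu (I - H S H^T) with S = (I/nu + H^T H)^{-1},
    # so only an (n+1) x (n+1) matrix is ever inverted.
    S = np.linalg.inv(np.eye(n + 1) / nu + H.T @ H)
    def Q_inv(v):
        return nu * (v - H @ (S @ (H.T @ v)))
    u = Q_inv(e)                              # initial iterate
    for _ in range(max_iter):
        Qu = u / nu + H @ (H.T @ u)
        # u_{i+1} = Q^{-1} (e + ((Q u_i - e) - alpha u_i)_+)
        u_new = Q_inv(e + np.maximum(Qu - e - alpha * u, 0.0))
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    w = A.T @ (d * u)
    gamma = -e @ (d * u)
    return w, gamma

def smw_update(S, H_new):
    """Incremental step: update S = (I/nu + H^T H)^{-1} when k new rows H_new
    (k, n+1) arrive, via SMW, so only a k x k inverse is computed instead of
    recomputing the (n+1) x (n+1) inverse from scratch."""
    T = H_new @ S                             # (k, n+1); S is symmetric
    K = np.linalg.inv(np.eye(H_new.shape[0]) + T @ H_new.T)
    return S - T.T @ K @ T
```

For online learning, `smw_update` is called with a single new row (k = 1); for batch incremental learning, with a block of rows. Either way, the update cost depends on the increment size and the input dimensionality, not on the number of previously seen samples.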
