Accurate on-line ν-support vector learning

The ν-Support Vector Machine (ν-SVM) for classification proposed by Schölkopf et al. has the advantage that its parameter ν controls the number of support vectors and margin errors. However, compared with the standard C-Support Vector Machine (C-SVM), its formulation is more complicated, and until now there has been no effective method for accurate on-line learning of the ν-SVM. In this paper, we propose a new, effective accurate on-line algorithm based on a modified formulation of the original ν-SVM. The algorithm comprises two special steps: relaxed adiabatic incremental adjustments and strict restoration adjustments. Experiments on several benchmark datasets demonstrate that, with these two steps, the accurate on-line algorithm avoids infeasible updating paths as far as possible and converges to the optimal solution. It converges quickly, especially with the Gaussian kernel, and is faster than the batch algorithm.
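
For context, a minimal sketch of the standard ν-SVM primal of Schölkopf et al. (reference [23]) is given below; the modified formulation that the proposed on-line algorithm actually operates on is not reproduced here. With ℓ training examples, the parameter ν ∈ (0, 1] is an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors, which is the control property the abstract refers to.

    % Standard nu-SVM primal (Schölkopf et al., 2000), shown for context only;
    % the paper's algorithm works with a modified version of this problem.
    \begin{aligned}
    \min_{\mathbf{w},\, b,\, \boldsymbol{\xi},\, \rho} \quad
      & \tfrac{1}{2}\|\mathbf{w}\|^{2} \;-\; \nu\rho \;+\; \tfrac{1}{\ell}\sum_{i=1}^{\ell} \xi_i \\
    \text{s.t.} \quad
      & y_i\bigl(\mathbf{w}^{\top}\Phi(\mathbf{x}_i) + b\bigr) \;\ge\; \rho - \xi_i, \\
      & \xi_i \ge 0, \quad i = 1, \dots, \ell, \qquad \rho \ge 0 .
    \end{aligned}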

[1] Robert Tibshirani, et al. The Entire Regularization Path for the Support Vector Machine, 2004, J. Mach. Learn. Res.

[2] Gert Cauwenberghs, et al. Incremental and Decremental Support Vector Machine Learning, 2000, NIPS.

[3] Klaus-Robert Müller, et al. Incremental Support Vector Learning: Analysis, Implementation and Applications, 2006, J. Mach. Learn. Res.

[4] Pavel Brazdil, et al. Proceedings of the European Conference on Machine Learning, 1993.

[5] Nello Cristianini, et al. The Kernel-Adatron Algorithm: A Fast and Simple Learning Procedure for Support Vector Machines, 1998, ICML.

[6] Bernhard Schölkopf, et al. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, 2005, IEEE Transactions on Neural Networks.

[7] David R. Musicant, et al. Successive overrelaxation for support vector machines, 1999, IEEE Trans. Neural Networks.

[8] Chih-Jen Lin, et al. A tutorial on ν-support vector machines, 2005.

[9] Ji Zhu, et al. Computing the Solution Path for the Regularized Support Vector Regression, 2005, NIPS.

[10] Mario Martín, et al. On-Line Support Vector Machine Regression, 2002, ECML.

[11] Stefan Rüping, et al. Incremental Learning with Support Vector Machines, 2001, ICDM.

[12] Cheng-Chew Lim, et al. An Implementation of Training Dual-nu Support Vector Machines, 2005.

[13] Mokhtar S. Bazaraa, et al. Nonlinear Programming: Theory and Algorithms, 1993.

[14] P. Schönemann. On artificial intelligence, 1985, Behavioral and Brain Sciences.

[15] Koby Crammer, et al. Online Passive-Aggressive Algorithms, 2003, J. Mach. Learn. Res.

[16] A. Shashua, et al. Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems, 2002.

[17] Chih-Jen Lin, et al. Training ν-Support Vector Classifiers: Theory and Algorithms, 2001, Neural Computation.

[18] Bin Gu, et al. On-line off-line Ranking Support Vector Machine and analysis, 2008, IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence).

[19] Arthur Gretton, et al. On-line one-class support vector machines. An application to signal segmentation, 2003, IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03).

[20] Gang Wang, et al. A kernel path algorithm for support vector machines, 2007, ICML '07.

[21] Vladimir Vapnik, et al. Statistical learning theory, 1998.

[22] Gert Cauwenberghs, et al. SVM incremental learning, adaptation and optimization, 2003, Proceedings of the International Joint Conference on Neural Networks.

[23] Bernhard Schölkopf, et al. New Support Vector Algorithms, 2000, Neural Computation.

[24] Bernhard Schölkopf, et al. A tutorial on ν-support vector machines, 2005.