The paper presents the use and calculation of the explicit bias term b in support vector machines (SVMs) trained with the Iterative Single Training Data learning Algorithm (ISDA). The proposed approach can be used for both nonlinear classification and nonlinear regression tasks. Unlike other iterative methods for solving SVM learning problems on huge data sets, such as sequential minimal optimization (SMO) and its variants, which must update at least two training data pairs per step, the algorithms shown here solve the QP learning problem with a single-training-data iteration routine. In this way the various second-order heuristics for choosing which data to update are avoided, and this makes the proposed ISD learning method remarkably fast. The algorithm can also be thought of as an application of the classic Gauss-Seidel (GS) coordinate ascent procedure, and of its derivative known as the successive over-relaxation (SOR) algorithm, to SVM learning from huge data sets subject to both the box constraints and the equality constraint (the latter arising from minimizing the primal objective function with respect to the bias term b). The final solution in the dual domain is not an approximate one: it is the optimal set of dual variables that would have been obtained by any of the existing, proven QP solvers, if only they could deal with huge data sets.
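To make the single-data iteration concrete, the sketch below shows a Gauss-Seidel/SOR-style coordinate ascent on the SVM classification dual with box constraints only; the paper's explicit handling of the bias b and the associated equality constraint is not reproduced here. All function names, the RBF kernel choice, and the default parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Gram matrix for an RBF kernel; gamma=0.5 is an assumed default.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def isda_classifier(K, y, C=1.0, omega=1.0, n_epochs=100, tol=1e-3):
    """Single-training-data coordinate ascent on the SVM dual.

    omega == 1 gives plain Gauss-Seidel; 1 < omega < 2 gives SOR.
    This bias-free variant enforces only the box 0 <= alpha_i <= C.
    """
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_epochs):
        worst = 0.0
        for i in range(n):
            # Dual gradient at point i: 1 - y_i * f(x_i),
            # using the already-updated alphas (Gauss-Seidel sweep).
            grad = 1.0 - y[i] * np.dot(alpha * y, K[:, i])
            # Over-relaxed single-point update, clipped to the box [0, C].
            alpha_new = np.clip(alpha[i] + omega * grad / K[i, i], 0.0, C)
            worst = max(worst, abs(alpha_new - alpha[i]))
            alpha[i] = alpha_new
        if worst < tol:  # stop once the largest per-point change is small
            break
    return alpha
```

A typical call would be `alpha = isda_classifier(rbf_kernel(X), y, C=10.0, omega=1.5)` with labels y in {-1, +1}; because each step touches a single alpha_i and needs only one kernel column, no pairwise working-set selection heuristic is required, which is the source of the speed-up the abstract describes.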