Learning with progressive transductive support vector machine

The support vector machine (SVM) is a learning method developed in recent years on the foundations of statistical learning theory. By taking a transductive rather than an inductive approach in support vector classifiers, the working set can serve as an additional source of information about margins. Compared with the traditional inductive support vector machine, the transductive support vector machine is often more powerful and can give better performance. In transduction, one estimates the classification function at the points of the working set using information from both the training data and the working set data. This helps to improve the generalization performance of SVMs, especially when the training data are inadequate. Intuitively, we would expect transductive learning to yield improvements when the training sets are small or when there is a significant deviation between the training and working set subsamples of the total population. In this paper, a progressive transductive support vector machine is proposed that extends Joachims' transductive SVM to handle differing class distributions; it removes the need to estimate the ratio of positive to negative examples in the working set. Experimental results show that the algorithm is promising.
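As an illustration of the general idea (not the paper's exact procedure), a progressive transductive scheme can be sketched as self-training with an inductive SVM: repeatedly assign labels to the working-set points classified with the largest margin, add them to the labeled set, and retrain. The function name, the step size, and the use of scikit-learn's `SVC` are assumptions made for this sketch.

```python
import numpy as np
from sklearn.svm import SVC

def progressive_transductive_svm(X_train, y_train, X_work,
                                 n_per_step=2, max_iter=100):
    """Progressively label the working set with the current SVM.

    Illustrative sketch only: at each step, the points of X_work that lie
    farthest from the decision boundary (most confident predictions) are
    given the predicted label and moved into the labeled set, then the
    SVM is retrained. Labels are assumed to be in {-1, +1}.
    """
    X_lab, y_lab = X_train.copy(), y_train.copy()
    unlabeled = list(range(len(X_work)))
    clf = SVC(kernel="linear").fit(X_lab, y_lab)
    for _ in range(max_iter):
        if not unlabeled:
            break
        # Signed distances to the decision boundary for remaining points.
        scores = clf.decision_function(X_work[unlabeled])
        # Pick the most confident candidates (largest |margin|).
        order = np.argsort(-np.abs(scores))[:n_per_step]
        picked = [unlabeled[i] for i in order]
        X_lab = np.vstack([X_lab, X_work[picked]])
        y_lab = np.concatenate(
            [y_lab, np.where(scores[order] >= 0, 1, -1)])
        unlabeled = [i for i in unlabeled if i not in picked]
        clf = SVC(kernel="linear").fit(X_lab, y_lab)
    return clf
```

With a tiny labeled set drawn from two well-separated clusters and the rest left as the working set, the progressively trained classifier typically recovers the cluster structure even though only a few labels were given initially.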