Nonlinear Support Vector Machines employ sophisticated kernel functions to classify data sets with complex decision surfaces. Tuning the parameters of such functions is not only computationally expensive; the resulting models are also susceptible to overfitting because of their large VC dimensions. Instead of fitting a single nonlinear model, this paper presents a framework called Localized Support Vector Machine (LSVM), which builds multiple linear SVM models from the training data. Since a separate model must be built to classify each test example, LSVM has a high computational cost. To overcome this limitation, we propose an efficient implementation of LSVM, termed Profile SVM (PSVM). PSVM partitions the training examples into clusters and builds a separate linear SVM model for each cluster. Our empirical results show that (1) both LSVM and PSVM outperform nonlinear SVM on the majority of the evaluated data sets, and (2) PSVM achieves accuracy comparable to LSVM's with significant computational savings.
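To make the cluster-then-classify idea concrete, the following is a minimal sketch of a PSVM-style procedure, not the paper's actual algorithm: plain k-means stands in for the paper's clustering step, and the class name ProfileSVMSketch, the number of clusters, and the regularization constant C are illustrative assumptions. It partitions the training set, fits one linear SVM per cluster, and routes each test example to the model of its nearest cluster.

# Sketch of a PSVM-style local linear SVM ensemble (assumptions:
# plain k-means as the clustering step, integer class labels).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

class ProfileSVMSketch:
    def __init__(self, n_clusters=5, C=1.0):
        self.kmeans = KMeans(n_clusters=n_clusters, n_init=10)
        self.C = C
        self.models = {}

    def fit(self, X, y):
        # Partition the training examples into clusters.
        labels = self.kmeans.fit_predict(X)
        for k in np.unique(labels):
            mask = labels == k
            # Train one linear SVM per cluster; if a cluster happens
            # to contain a single class, store that label directly.
            if len(np.unique(y[mask])) > 1:
                self.models[k] = LinearSVC(C=self.C).fit(X[mask], y[mask])
            else:
                self.models[k] = y[mask][0]
        return self

    def predict(self, X):
        # Route each test example to its nearest cluster's model.
        clusters = self.kmeans.predict(X)
        preds = np.empty(len(X), dtype=int)
        for k in np.unique(clusters):
            mask = clusters == k
            model = self.models[k]
            preds[mask] = (model.predict(X[mask])
                           if hasattr(model, "predict") else model)
        return preds

Because each local model is linear, training and prediction avoid kernel evaluations entirely; the cost of localization is reduced to one clustering pass plus a nearest-centroid lookup per test example.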