Incremental kernel learning algorithms and applications

Since Support Vector Machines (SVMs) were introduced in 1995, they have been recognized as essential tools for pattern classification and function approximation, and numerous publications show that SVMs outperform other learning methods in various areas. However, SVMs perform poorly on large-scale data sets because of their high computational complexity. One approach to overcoming this limitation is incremental learning, in which a large-scale data set is divided into several subsets and the model is trained on each subset in turn while updating the core information extracted from the previous subsets. This approach has a drawback of its own: the core information accumulates during the incremental procedure, so the effective training set keeps growing. Moreover, when the large-scale data set has a special structure (e.g., an imbalanced data set), the standard SVM may not perform properly. In this study, a novel approach based on the reduced convex hull concept is developed and applied in several application areas. In addition, the developed concept is applied to Support Vector Regression (SVR) to improve its performance. The experiments performed show that the incremental revised SVM significantly reduces the number of support vectors and requires less computing time. The incremental revised SVR produces results similar to those of the standard SVR while reducing computing time significantly. Furthermore, the filter concept developed in this study may be used to reduce computing time in other learning approaches.
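To make the chunk-wise strategy described above concrete, the following is a minimal sketch of incremental SVM training in which the "core information" carried between subsets is the set of support vectors. It uses scikit-learn's SVC as the base learner; the function name, chunking scheme, and parameters are illustrative assumptions, not the author's implementation.

```python
# Minimal sketch of chunk-wise incremental SVM training (illustrative only).
# Assumption: the core information passed between subsets is the current
# set of support vectors, retrained together with each new subset.
import numpy as np
from sklearn.svm import SVC

def incremental_svm(X, y, n_chunks=10, C=1.0, kernel="rbf"):
    core_X = np.empty((0, X.shape[1]))
    core_y = np.empty((0,), dtype=y.dtype)
    model = None
    for X_chunk, y_chunk in zip(np.array_split(X, n_chunks),
                                np.array_split(y, n_chunks)):
        # Train on the new subset plus the support vectors kept so far.
        X_train = np.vstack([core_X, X_chunk])
        y_train = np.concatenate([core_y, y_chunk])
        model = SVC(C=C, kernel=kernel).fit(X_train, y_train)
        # Keep only the support vectors as the core for the next round.
        core_X = X_train[model.support_]
        core_y = y_train[model.support_]
    return model
```

Note that this sketch also exhibits the drawback the abstract points out: the carried-over core can only grow across rounds, which is the accumulation problem the revised (reduced convex hull) formulation is designed to mitigate by shrinking the set of retained support vectors.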