Statistical learning theory and its application to pattern recognition

The problem of pattern recognition is formulated as a classification problem in statistical learning theory. Vapnik constructed a class of learning algorithms, called support vector machines (SVMs), to solve this problem. The algorithm not only has a strong theoretical foundation but also provides a powerful tool for solving real-life problems. However, it still has some drawbacks. Two of them are: 1) in the linearly separable case, the computational complexity of finding the optimal separating hyperplane is quite high; and 2) in the linearly non-separable case, for a given sample set it is hard to choose a proper nonlinear mapping (kernel function) such that the sample set becomes linearly separable in the mapped space. To overcome these drawbacks, we present some new approaches, along with their main ideas and some experimental results.
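To make the second drawback concrete, a minimal sketch (illustrative only, not the authors' method) of the nonlinear-mapping idea: one-dimensional points labeled by magnitude cannot be separated by any threshold on x, but under the hypothetical feature map phi(x) = (x, x^2) they become linearly separable, which is exactly what a suitable kernel achieves implicitly.

```python
def phi(x):
    """Explicit feature map corresponding to a simple polynomial kernel.

    The associated kernel is k(u, v) = phi(u) . phi(v) = u*v + (u*v)**2,
    which an SVM could evaluate without forming phi explicitly.
    """
    return (x, x * x)

# Class +1 iff |x| >= 2: no single threshold on x separates the labels.
X = [-3.0, -2.0, -1.0, 1.0, 2.0, 3.0]
y = [1, 1, -1, -1, 1, 1]

# In the mapped space a hyperplane w . phi(x) + b = 0 does separate them;
# w = (0, 1), b = -2.5 works, since the coordinate x^2 alone decides.
w, b = (0.0, 1.0), -2.5

def predict(x):
    fx = phi(x)
    return 1 if w[0] * fx[0] + w[1] * fx[1] + b >= 0 else -1

assert all(predict(xi) == yi for xi, yi in zip(X, y))
```

The difficulty noted in the abstract is that for real data no such simple map is known in advance, so choosing the kernel that renders the sample set linearly separable is hard.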