A New Method for SVM Hyper-parameters Optimization

The performance of a Support Vector Machine (SVM) is determined by its hyper-parameters, and optimizing the hyper-parameters requires a criterion. This paper presents a new SVM hyper-parameter optimization method in which the criterion is maximizing the minimum algebraic distance from the samples to the class-separating hyper-surface in the input space. The main idea is to separate the classes as widely as possible over the whole original input space for all samples, which adheres more closely to the structural risk minimization principle. The method is simple, geometrically intuitive, and easy to implement. Its feasibility is demonstrated through experiments on two classical benchmark classification problems: the Two Spirals Problem (TSP) and the Iris dataset.
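The criterion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-learn's `SVC` on hypothetical toy data, and it scores each hyper-parameter pair by the minimum unsigned value of the decision function over the training samples, which is a feature-space proxy for the paper's input-space algebraic distance.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical two-class toy data standing in for the paper's benchmarks.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)),
               rng.normal(1.0, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

best = None
for C in [0.1, 1.0, 10.0, 100.0]:
    for gamma in [0.01, 0.1, 1.0, 10.0]:
        clf = SVC(C=C, gamma=gamma).fit(X, y)
        # Minimum unsigned margin over all samples: the quantity to maximize.
        margin = np.abs(clf.decision_function(X)).min()
        if best is None or margin > best[0]:
            best = (margin, C, gamma)

print("selected C=%g, gamma=%g" % (best[1], best[2]))
```

The grid search itself is generic; only the scoring rule (maximize the minimum sample-to-boundary distance rather than, say, cross-validation accuracy) reflects the criterion proposed in the paper.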