A Study of Tuning Hyperparameters for Support Vector Machines

Automatic parameter selection is an important issue in making support vector machines (SVMs) practically useful. Most existing approaches apply Newton's method directly to compute the optimal parameters, treating parameter optimization as an unconstrained optimization problem. In this paper, the limitations of these existing approaches are discussed, and a new methodology for optimizing kernel parameters, based on computing the gradient of a penalty function with respect to the RBF kernel parameters, is proposed. Simulation results show that the new approach is feasible and demonstrate an improvement in generalization ability.
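
To make the idea of gradient-based tuning of RBF kernel parameters concrete, the sketch below performs gradient descent on (log C, log gamma) for an RBF-kernel SVM. It is illustrative only: the paper's analytic gradient of its penalty function is replaced here by a central finite-difference approximation of an assumed smooth validation penalty (mean hinge loss on a held-out set), and the helper names (penalty, grad_fd), step size, and iteration count are arbitrary choices for demonstration, not the authors' method.

```python
# Minimal sketch: gradient-based tuning of RBF-SVM hyperparameters (C, gamma).
# A finite-difference gradient of a smooth validation penalty stands in for
# the paper's analytic penalty-function gradient.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
y_va_pm = 2 * y_va - 1  # validation labels in {-1, +1} for the hinge loss

def penalty(theta):
    """Smooth validation penalty (mean hinge loss) at theta = [log C, log gamma]."""
    C, gamma = np.exp(theta)
    clf = SVC(C=C, gamma=gamma, kernel="rbf").fit(X_tr, y_tr)
    margins = y_va_pm * clf.decision_function(X_va)
    return np.mean(np.maximum(0.0, 1.0 - margins))

def grad_fd(theta, eps=1e-2):
    """Central finite-difference gradient of the penalty w.r.t. theta."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (penalty(theta + e) - penalty(theta - e)) / (2 * eps)
    return g

theta = np.array([0.0, np.log(1.0 / X.shape[1])])  # start at C = 1, gamma = 1/d
for _ in range(30):
    theta -= 0.5 * grad_fd(theta)  # plain gradient descent in log-parameter space
print("tuned C=%.3f, gamma=%.4f, penalty=%.4f"
      % (np.exp(theta[0]), np.exp(theta[1]), penalty(theta)))
```

Working in log-parameter space keeps C and gamma positive during the descent; a line search or an analytic gradient, as proposed in the paper, would replace the fixed step size and finite differences used here.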