Margin-like quantities and generalized approximate cross validation for support vector machines

We examine support vector machines (SVMs) from the point of view of solutions to variational problems in a reproducing kernel Hilbert space. We discuss the generalized comparative Kullback-Leibler distance (GCKL) as a target for choosing the tuning parameters in an SVM, and we propose the generalized approximate cross validation (GACV) estimate as a reasonable proxy for this target. We indicate an interesting relationship between the GACV and the SVM margin.
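As a concrete illustration of the margin quantity mentioned above, the following sketch trains a plain linear SVM (regularized hinge loss, subgradient descent) on separable toy data and reports the geometric margin 2/||w||. This is only a minimal pure-Python illustration, not the paper's RKHS/variational formulation, and the data and hyperparameters are invented for the example.

```python
# Minimal sketch (not the paper's method): linear SVM via subgradient
# descent on the regularized hinge loss, reporting the margin 2/||w||.
import math
import random

def train_svm(X, y, lam=0.01, epochs=200, lr=0.05):
    """Minimize (1/n) sum_i max(0, 1 - y_i (w.x_i + b)) + lam * ||w||^2."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        # Subgradient of the regularizer.
        gw, gb = [2 * lam * wj for wj in w], 0.0
        for xi, yi in zip(X, y):
            functional_margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if functional_margin < 1:  # hinge loss active at this point
                for j in range(d):
                    gw[j] -= yi * xi[j] / n
                gb -= yi / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

random.seed(0)
# Separable toy data: class +1 near (2, 2), class -1 near (-2, -2).
X = ([[2 + random.gauss(0, 0.3), 2 + random.gauss(0, 0.3)] for _ in range(20)]
     + [[-2 + random.gauss(0, 0.3), -2 + random.gauss(0, 0.3)] for _ in range(20)])
y = [1] * 20 + [-1] * 20

w, b = train_svm(X, y)
geometric_margin = 2 / math.sqrt(sum(wj * wj for wj in w))
acc = sum(1 for xi, yi in zip(X, y)
          if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) > 0) / len(y)
print(acc, geometric_margin)
```

In practice the tuning parameters here (the regularization weight `lam`, and kernel parameters in the nonlinear case) are what cross-validation-type criteria such as the GACV are meant to choose; full leave-out cross validation would retrain this model once per held-out point, which is the expense an approximate criterion avoids.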