An efficient method for computing leave-one-out error in support vector machines with Gaussian kernels
S. Sathiya Keerthi | Dennis DeCoste | Chong Jin Ong | Martin M. S. Lee
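To make the quantity in the title concrete, the sketch below computes a leave-one-out (LOO) error estimate the naive way: retrain on the remaining n−1 points and test on the held-out point, for every point. The Gaussian (RBF) kernel is the one named in the title; the classifier here is a toy kernel-centroid stand-in, not an SVM, and the brute-force retraining loop is exactly the cost that efficient LOO methods for SVMs aim to avoid. All names and the example data are illustrative assumptions, not from the paper.

```python
import math

def gaussian_kernel(x, z, gamma=1.0):
    # Gaussian (RBF) kernel: K(x, z) = exp(-gamma * ||x - z||^2)
    d2 = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * d2)

def kernel_centroid_predict(X_train, y_train, x, gamma=1.0):
    # Toy stand-in classifier (NOT an SVM): assign x to the class whose
    # training points have the larger mean kernel similarity to x.
    scores = {}
    for label in set(y_train):
        pts = [xi for xi, yi in zip(X_train, y_train) if yi == label]
        scores[label] = sum(gaussian_kernel(x, p, gamma) for p in pts) / len(pts)
    return max(scores, key=scores.get)

def loo_error(X, y, gamma=1.0):
    # Naive leave-one-out estimate: for each i, train on the other n-1
    # points and test on point i. This n-fold retraining is the expense
    # that efficient LOO computations for SVMs are designed to bypass.
    mistakes = 0
    for i in range(len(X)):
        X_tr = X[:i] + X[i + 1:]
        y_tr = y[:i] + y[i + 1:]
        if kernel_centroid_predict(X_tr, y_tr, X[i], gamma) != y[i]:
            mistakes += 1
    return mistakes / len(X)

# Two well-separated two-point clusters: every held-out point is still
# classified correctly by the remaining three, so the LOO error is 0.
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
y = [0, 0, 1, 1]
print(loo_error(X, y, gamma=1.0))  # -> 0.0
```

The naive estimator is unbiased for the expected error of a classifier trained on n−1 points, which is why LOO bounds (e.g. references [2] and [10] below on the radius/margin and stability views) are a common target for SVM model selection.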
[1] David Haussler, et al. Probabilistic kernel regression models, 1999, AISTATS.
[2] Massimiliano Pontil, et al. Leave One Out Error, Stability, and Generalization of Voting Combinations of Classifiers, 2004, Machine Learning.
[3] Kiri Wagstaff, et al. Alpha seeding for support vector machines, 2000, KDD '00.
[4] S. Sathiya Keerthi, et al. Evaluation of simple performance measures for tuning SVM hyperparameters, 2003, Neurocomputing.
[5] S. Sathiya Keerthi, et al. Improvements to Platt's SMO Algorithm for SVM Classifier Design, 2001, Neural Computation.
[6] Thorsten Joachims, et al. Estimating the Generalization Performance of an SVM Efficiently, 2000, ICML.
[7] John Platt, et al. Fast training of SVMs using sequential minimal optimization, 1998.
[8] Jiao Licheng, et al. Automatic model selection for support vector machines using heuristic genetic algorithm, 2006.
[9] Tong Zhang, et al. A Leave-One-out Cross Validation Bound for Kernel Methods with Applications in Learning, 2001, COLT/EuroCOLT.
[10] V. Vapnik, et al. Bounds on Error Expectation for Support Vector Machines, 2000, Neural Computation.
[11] Chih-Jen Lin, et al. Asymptotic convergence of an SMO algorithm without any assumptions, 2002, IEEE Trans. Neural Networks.
[12] Chih-Jen Lin, et al. A Simple Decomposition Method for Support Vector Machines, 2002, Machine Learning.
[13] Vladimir Vapnik, et al. Statistical learning theory, 1998.