Multiple kernel learning, conic duality, and the SMO algorithm