Nonlinear Support Vector Machines Through Iterative Majorization and I-Splines

To minimize the primal support vector machine (SVM) problem, we propose to use iterative majorization. To allow for nonlinear effects of the predictors, we transform them by (non)monotone I-spline bases. An advantage over the usual kernel approach in the dual problem is that the predictor variables remain easily interpretable. We illustrate this with an example from the literature.
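As a concrete illustration of the idea (a sketch in our own notation, not the authors' code): the hinge loss of each observation can be bounded above by a quadratic that touches it at the current estimate, so every majorization step reduces to a weighted ridge regression and the primal objective never increases. The lowest-order I-spline basis is shown as simple ramp functions; all names (`svm_majorize`, `ispline_ramps`, `lam`, `eps`) and the specific quadratic majorizer are our assumptions, and higher-order I-splines (Ramsay, 1988) would give smoother transformations.

```python
import numpy as np

def ispline_ramps(x, knots):
    """Lowest-order I-spline basis: each column ramps linearly from 0 to 1
    between consecutive knots, so any nonnegative combination of columns
    is a monotone transformation of x.  (Illustrative low-order choice.)"""
    cols = [np.clip((x - lo) / (hi - lo), 0.0, 1.0)
            for lo, hi in zip(knots[:-1], knots[1:])]
    return np.column_stack(cols)

def svm_majorize(X, y, lam=1.0, n_iter=100, eps=1e-8):
    """Minimize the primal SVM objective
        L(w, b) = sum_i max(0, 1 - y_i (x_i @ w + b)) + lam * ||w||^2
    by iterative majorization.  Writing u_i = 1 - y_i f_i and using
    |u| <= u^2 / (2c) + c / 2 at c = |u_old|, each hinge term is majorized
    by a quadratic in f_i, so each iteration is a weighted ridge step."""
    n, p = X.shape
    Z = np.column_stack([np.ones(n), X])        # prepend intercept column
    beta = np.zeros(p + 1)
    P = np.eye(p + 1)
    P[0, 0] = 0.0                               # do not penalize the intercept
    for _ in range(n_iter):
        u = 1.0 - y * (Z @ beta)                # hinge argument 1 - y_i f_i
        a = 1.0 / (4.0 * np.maximum(np.abs(u), eps))  # majorizer curvatures
        t = y * (1.0 + np.abs(u))               # working targets
        AZ = a[:, None] * Z
        beta = np.linalg.solve(Z.T @ AZ + lam * P, Z.T @ (a * t))
    return beta                                  # beta[0] = b, beta[1:] = w

def svm_objective(X, y, beta, lam=1.0):
    """Primal SVM loss: summed hinge loss plus ridge penalty on w."""
    f = beta[0] + X @ beta[1:]
    return np.maximum(0.0, 1.0 - y * f).sum() + lam * (beta[1:] @ beta[1:])
```

To handle nonlinear predictors, one would expand each column of `X` with `ispline_ramps` before calling `svm_majorize`; the fitted coefficients then remain attached to named, interpretable predictor transformations rather than to kernel evaluations.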
