Nonlinear Kernel-Based Chemometric Tools: A Machine Learning Approach

This paper provides a short introduction to support vector machines and other nonlinear kernel-based methods recently developed in machine learning research. We describe the principles behind constructing nonlinear kernel-based variants of linear methods that are widely used in chemometrics, including nonlinear kernel forms of partial least squares, canonical correlation analysis, principal component analysis, principal component regression, and ridge regression.
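As a concrete illustration of the construction principle discussed in the paper, the sketch below (not taken from the paper itself) shows ridge regression rewritten in dual variables, where training inputs enter only through inner products; replacing those inner products with a kernel function yields the nonlinear kernel variant. The function names, the Gaussian kernel choice, and the parameters gamma and lam are illustrative assumptions, not part of the original text.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-2):
    # Dual ridge-regression solution: alpha = (K + lam * I)^{-1} y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, X_new, alpha, gamma=1.0):
    # Prediction is a kernel expansion over the training points:
    # f(x) = sum_i alpha_i * k(x_i, x)
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy usage (hypothetical data): fit a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)
alpha = kernel_ridge_fit(X, y, gamma=0.5, lam=1e-2)
y_hat = kernel_ridge_predict(X, X, alpha, gamma=0.5)

The same substitution of inner products by kernel evaluations underlies the kernel forms of PLS, CCA, PCA, and PCR discussed in the paper, with the linear algebra carried out on the kernel matrix instead of the original variables.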
