Kernel continuum regression

Continuum regression provides an appealing framework that connects ordinary least squares, partial least squares, and principal component regression in a single family indexed by one continuum parameter. It offers insight into the underlying regression model for a given application and a deeper understanding of how these regression techniques relate to one another. Despite this useful framework, however, existing work on continuum regression is limited to linear regression, whereas many applications require nonlinear models. This work extends continuum regression from linear to nonlinear models via kernel learning. The proposed kernel continuum regression technique is general and accommodates very flexible regression model estimation, and an efficient algorithm is developed for fast implementation. Numerical examples demonstrate the usefulness of the proposed technique.
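To make the idea concrete, below is a minimal numerical sketch of a one-component kernel continuum regression fit. It assumes the common gamma-in-[0, infinity) parameterization of Stone and Brooks' criterion, written in dual form as maximizing (a'Ky)^2 (a'K^2 a)^(gamma-1) subject to a'Ka = 1, where K is the Gram matrix; gamma = 0, gamma = 1, and large gamma then correspond to kernel OLS-, PLS-, and PCR-type directions. The RBF kernel, the generic BFGS maximizer, and the function names are illustrative assumptions, not the paper's own algorithm.

```python
# Hedged sketch of one-component kernel continuum regression (not the paper's
# efficient algorithm): maximize the dual criterion numerically, then regress
# the response on the resulting latent score vector.
import numpy as np
from scipy.optimize import minimize


def rbf_kernel(X, Z, bandwidth=1.0):
    """Gaussian (RBF) Gram matrix between the rows of X and the rows of Z."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq / (2.0 * bandwidth ** 2))


def kernel_cr_component(K, y, gamma):
    """Dual weights a maximizing (a'Ky)^2 (a'K^2a)^(gamma-1) with a'Ka = 1.

    gamma = 0 gives an OLS-type direction, gamma = 1 a kernel-PLS direction,
    and large gamma approaches the leading kernel-PCA direction.
    """
    Ky = K @ y
    K2 = K @ K

    def neg_criterion(a):
        # Rescale so that a'Ka = 1; the rescaled criterion is scale-invariant.
        a = a / (np.sqrt(a @ K @ a) + 1e-12)
        cov = a @ Ky          # covariance-type term in the feature space
        var = a @ K2 @ a      # variance-type term in the feature space
        return -(cov ** 2) * var ** (gamma - 1.0)

    a0 = Ky / np.linalg.norm(Ky)          # start from the kernel-PLS direction
    a = minimize(neg_criterion, a0, method="BFGS").x
    return a / np.sqrt(a @ K @ a)


# Toy usage: one component on a nonlinear signal, prediction on new inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
y = y - y.mean()

K = rbf_kernel(X, X)
a = kernel_cr_component(K, y, gamma=0.5)   # partway between OLS and PLS
t = K @ a                                  # latent scores of the component
coef = (t @ y) / (t @ t)                   # regress y on the score vector

X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
y_hat = coef * (rbf_kernel(X_new, X) @ a)  # predictions at the new inputs
print(np.round(y_hat, 3))
```

In practice one would also center the Gram matrix, extract several components by deflation, and tune gamma and the kernel bandwidth by cross-validation; the sketch only illustrates how a single continuum direction is obtained in the kernel setting.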
