Robust Regularized Kernel Regression

Robust regression techniques are critical for fitting noisy data in real-world applications. Most previous work on robust kernel regression formulates the problem in a dual form, which is then solved by a quadratic programming solver. In this correspondence, we propose a new formulation for robust regularized kernel regression under the theoretical framework of regularization networks and tackle the optimization problem directly in the primal. We show that the primal and dual approaches are equivalent in that they achieve similar regression performance, but the primal formulation is more efficient and easier to implement than the dual one. Unlike previous work, our approach also optimizes the bias term. In addition, we show that the proposed solution extends easily to other robust loss functions, including the Huber-ε insensitive loss function. Finally, we conduct a set of experiments on both artificial and real data sets, in which promising results show that the proposed method is effective and more efficient than traditional approaches.
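For concreteness, the primal formulation described above can be sketched in the standard regularization-network form (our notation, a hedged sketch rather than the paper's exact statement): given training pairs (x_i, y_i), i = 1, ..., n, a kernel K, a robust loss V, and a regularization parameter λ, one solves

\[
\min_{\alpha \in \mathbb{R}^n,\; b \in \mathbb{R}} \;\; \frac{1}{n} \sum_{i=1}^{n} V\!\left( y_i - \sum_{j=1}^{n} \alpha_j K(x_i, x_j) - b \right) + \lambda\, \alpha^{\top} \mathbf{K}\, \alpha,
\]

where the finite expansion f(x) = \sum_j \alpha_j K(x, x_j) is justified by the representer theorem, \mathbf{K} is the kernel matrix, and b is the bias term optimized jointly with \alpha. A representative robust choice for V is the Huber loss

\[
V_\delta(r) \;=\;
\begin{cases}
\tfrac{1}{2} r^2, & |r| \le \delta,\\[2pt]
\delta |r| - \tfrac{1}{2}\delta^2, & |r| > \delta,
\end{cases}
\]

whose derivative is bounded by δ, so large residuals (outliers) exert only limited influence on the fit. Because such losses are differentiable, the unconstrained problem in (α, b) can be minimized directly with Newton-type or quasi-Newton methods, avoiding the quadratic-programming machinery required by the dual.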
