Cutting Plane Method for Continuously Constrained Kernel-Based Regression

Incorporating constraints into kernel-based regression is an effective means of improving regression performance. In many applications, however, the constraints are continuous with respect to some parameters, which creates computational difficulties. Discretizing the constraints is a reasonable way to address these difficulties. In the context of kernel-based regression, however, most existing works rely on a prior discretization strategy, which suffers from two inherent deficiencies: it cannot guarantee that the regression result fully satisfies the original continuous constraints, and it can hardly handle high-dimensional problems. This paper proposes a cutting plane method (CPM) for constrained kernel-based regression problems and a relaxed CPM (R-CPM) for high-dimensional problems. The CPM discretizes the continuous constraints iteratively and guarantees that the regression result strictly satisfies the original constraints. For high-dimensional problems, the R-CPM accepts a slight and controlled constraint violation in exchange for a dimension-independent computational complexity. The validity of the proposed methods is verified by numerical experiments.
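To make the iterative discretization concrete, below is a minimal sketch of a generic cutting-plane loop for kernel ridge regression with a continuous nonnegativity constraint f(t) >= 0 on an interval. All names here (fit_cpm, rbf) and the one-dimensional nonnegativity constraint are illustrative assumptions, not the paper's exact formulation; in particular, the most violated point is located by a fine grid search for simplicity, where an exact subproblem solver would be used in practice. Setting tol = 0 mirrors the strict CPM, while a small positive tol mirrors the relaxed R-CPM's controlled violation.

```python
import numpy as np
from scipy.optimize import minimize

def rbf(a, b, gamma=10.0):
    """Gaussian RBF kernel matrix between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-gamma * d2)

def fit_cpm(x, y, lam=1e-3, lo=0.0, hi=1.0, tol=0.0, max_iter=50):
    """Cutting-plane loop (illustrative sketch): enforce f(t) >= 0 on
    [lo, hi] by iteratively adding the most violated point to a finite
    working set and re-solving the constrained regression problem."""
    K = rbf(x, x)
    working = []                        # constraint points found so far
    dense = np.linspace(lo, hi, 2001)   # search grid for violations

    def objective(alpha):
        # regularized least-squares objective of kernel ridge regression
        r = K @ alpha - y
        return 0.5 * r @ r + 0.5 * lam * alpha @ K @ alpha

    alpha = np.zeros(len(x))
    for _ in range(max_iter):
        cons = []
        if working:
            T = np.array(working)
            Kt = rbf(T, x)
            # linear inequality constraints f(t) >= 0 at working points
            cons = [{'type': 'ineq', 'fun': lambda a, Kt=Kt: Kt @ a}]
        res = minimize(objective, alpha, constraints=cons, method='SLSQP')
        alpha = res.x
        # locate the most violated point on the continuous domain
        f_dense = rbf(dense, x) @ alpha
        if f_dense.min() >= -tol:       # constraints hold (up to tol)
            break
        working.append(dense[np.argmin(f_dense)])  # add a cutting plane
    return alpha, working

# Demo on synthetic data whose noise dips below zero.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.clip(np.sin(2 * np.pi * x), 0, None) + 0.05 * rng.standard_normal(30)
alpha, pts = fit_cpm(x, y, tol=1e-6)
print(f"added {len(pts)} cutting planes")
```

The design choice this sketch is meant to illustrate is the one the abstract describes: the constraint set is grown only where violations are actually detected, so the final discretization adapts to the fitted function instead of being fixed in advance as in the prior discretization strategy.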
