On implicit Lagrangian twin support vector regression by Newton method

Abstract: In this work, an implicit Lagrangian formulation for the dual twin support vector regression is proposed. The formulation determines non-parallel ε-insensitive down- and up-bound functions for the unknown regressor by constructing two unconstrained quadratic programming problems of smaller size, instead of the single large one solved in standard support vector regression (SVR). The two related support vector machine-type problems are solved by the Newton method. Numerical experiments were performed on a number of synthetic and real-world benchmark datasets, and the results were compared with those of SVR and twin SVR. The similar or better generalization performance of the proposed method illustrates its effectiveness and applicability.
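For readers unfamiliar with the twin SVR setting the abstract refers to, the following is a minimal sketch of the linear formulation as it commonly appears in the twin SVR literature; the notation (training matrix A ∈ R^{m×n}, response vector Y ∈ R^m, vector of ones e, insensitivity parameters ε1, ε2 > 0, penalty parameters C1, C2 > 0) is assumed here and is not taken from this record. The down-bound function f_1(x) = w_1^T x + b_1 and the up-bound function f_2(x) = w_2^T x + b_2 are obtained from the pair of smaller problems

\min_{w_1,\, b_1,\, \xi}\ \tfrac{1}{2}\,\bigl\| Y - e\varepsilon_1 - (A w_1 + e b_1) \bigr\|^2 + C_1 e^{T} \xi
\quad \text{s.t.} \quad Y - (A w_1 + e b_1) \ge e\varepsilon_1 - \xi,\ \ \xi \ge 0,

\min_{w_2,\, b_2,\, \eta}\ \tfrac{1}{2}\,\bigl\| Y + e\varepsilon_2 - (A w_2 + e b_2) \bigr\|^2 + C_2 e^{T} \eta
\quad \text{s.t.} \quad (A w_2 + e b_2) - Y \ge e\varepsilon_2 - \eta,\ \ \eta \ge 0,

and the estimated regressor is the mean of the two bound functions,

f(x) = \tfrac{1}{2}\bigl(f_1(x) + f_2(x)\bigr) = \tfrac{1}{2}(w_1 + w_2)^{T} x + \tfrac{1}{2}(b_1 + b_2).

Each dual is a bound-constrained quadratic program in only m Lagrange multipliers; the contribution described in the abstract is to replace this pair of constrained duals by unconstrained minimizations of an implicit Lagrangian and to solve the resulting problems by a Newton iteration rather than a generic quadratic programming solver.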
