Linear Twin Quadratic Surface Support Vector Regression

Twin support vector regression (TSVR) generates two nonparallel hyperplanes by solving a pair of smaller problems instead of the single large problem in standard SVR. Owing to this efficiency, TSVR is widely applied in various areas. In this paper, we propose a new variant of TSVR, named Linear Twin Quadratic Surface Support Vector Regression (LTQSSVR), which directly uses two quadratic surfaces in the original space for regression. Notably, the new approach not only avoids the notoriously difficult and time-consuming task of searching for a suitable kernel function and its parameters in traditional kernel-based SVR methods, but also achieves better generalization performance. Moreover, to further improve the efficiency and robustness of the model, we adopt the 1-norm to measure the regression error. The resulting linear programming structure avoids matrix inversion and makes the model tractable for very large problems, a capability that is increasingly important in the big-data era. To verify the effectiveness and efficiency of our model, we compare it with several well-known methods; numerical experiments on 2 artificial data sets and 12 benchmark data sets demonstrate the validity and applicability of the proposed method.
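The abstract combines two ingredients: a kernel-free quadratic surface fitted directly in the original input space, and a 1-norm error measure that turns the resulting problem into a linear program with no matrix inversion. Since the exact LTQSSVR objective and constraints are not reproduced here, the following is only a minimal sketch of that general idea, assuming a single quadratic surface f(x) = 0.5 x'Wx + b'x + c fitted under a least-absolute-deviation loss with 1-norm regularization; the helper names `quad_features` and `fit_lad_quadratic` and the trade-off parameter `C` are illustrative choices, not taken from the paper.

```python
# Illustrative sketch only: kernel-free quadratic-surface regression with a
# 1-norm (least-absolute-deviation) loss and 1-norm regularization, posed as
# a single linear program.  This is NOT the paper's exact LTQSSVR model.

import numpy as np
from scipy.optimize import linprog


def quad_features(X):
    """Map each x in R^n to its quadratic monomials, linear terms and a bias,
    so f(x) = 0.5 x^T W x + b^T x + c becomes linear in one parameter vector."""
    m, n = X.shape
    cols = [X[:, j] * X[:, k] for j in range(n) for k in range(j, n)]
    cols += [X[:, j] for j in range(n)]
    cols += [np.ones(m)]
    return np.column_stack(cols)


def fit_lad_quadratic(X, y, C=1.0):
    """Solve  min_w ||w||_1 + C * sum_i |y_i - phi(x_i)^T w|  as an LP."""
    Phi = quad_features(X)
    m, d = Phi.shape
    # decision vector: [w (free), u >= |w|, t >= |residual|]
    c = np.concatenate([np.zeros(d), np.ones(d), C * np.ones(m)])
    A_ub = np.block([
        [ np.eye(d), -np.eye(d),        np.zeros((d, m))],  #  w - u <= 0
        [-np.eye(d), -np.eye(d),        np.zeros((d, m))],  # -w - u <= 0
        [ Phi,        np.zeros((m, d)), -np.eye(m)],        #  Phi w - t <= y
        [-Phi,        np.zeros((m, d)), -np.eye(m)],        # -Phi w - t <= -y
    ])
    b_ub = np.concatenate([np.zeros(2 * d), y, -y])
    bounds = [(None, None)] * d + [(0, None)] * (d + m)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w = res.x[:d]
    return lambda Xnew: quad_features(Xnew) @ w


# usage: fit on toy data generated from a noisy quadratic surface
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(60, 2))
    y = X[:, 0] ** 2 - X[:, 0] * X[:, 1] + 0.5 + 0.05 * rng.standard_normal(60)
    predict = fit_lad_quadratic(X, y, C=10.0)
    print(np.mean(np.abs(predict(X) - y)))  # in-sample mean absolute error
```

LTQSSVR itself solves a pair of such problems, one for each of the two nonparallel quadratic surfaces; the point of the sketch is the linear programming structure, which is what lets each subproblem avoid matrix inversion and scale to large data sets.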
