SLiKER: Sparse loss induced kernel ensemble regression

[1]  Yuji Matsumoto,et al.  Ridge Regression, Hubness, and Zero-Shot Learning , 2015, ECML/PKDD.

[2]  Nojun Kwak,et al.  Kernel discriminant analysis for regression problems , 2012, Pattern Recognit..

[3]  Jianping Fan,et al.  A generalized least-squares approach regularized with graph embedding for dimensionality reduction , 2020, Pattern Recognit..

[4]  Ji Feng,et al.  Deep Forest: Towards An Alternative to Deep Neural Networks , 2017, IJCAI.

[5]  Sun-Yuan Kung,et al.  Cost-effective kernel ridge regression implementation for keystroke-based active authentication system , 2014, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).

[6]  Tianqi Chen,et al.  XGBoost: A Scalable Tree Boosting System , 2016, KDD.

[7]  Bin Luo,et al.  An efficient multiple kernel learning in reproducing kernel Hilbert spaces (RKHS) , 2015, Int. J. Wavelets Multiresolution Inf. Process..

[8]  Hongwei Sun,et al.  Mercer theorem for RKHS on noncompact sets , 2005, J. Complex..

[9]  Muhammad Atif Tahir,et al.  Safe semi supervised multi-target regression (MTR-SAFER) for new targets learning , 2018, Multimedia Tools and Applications.

[10]  Jianping Fan,et al.  Least squares kernel ensemble regression in Reproducing Kernel Hilbert Space , 2018, Neurocomputing.

[11]  Joaquín Aranda Almansa,et al.  Modelling of a surface marine vehicle with kernel ridge regression confidence machine , 2019, Appl. Soft Comput..

[12]  Bin Hu,et al.  A Study on Emotion Recognition Based on Hierarchical Adaboost Multi-class Algorithm , 2018, ICA3PP.

[13]  George D. C. Cavalcanti,et al.  Dynamic classifier selection: Recent advances and perspectives , 2018, Inf. Fusion.

[14]  Timothy M. Young,et al.  Kernel Ridge Regression with Lagged-Dependent Variable: Applications to Prediction of Internal Bond Strength in a Medium Density Fiberboard Process , 2012, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).

[15]  Weifeng Liu,et al.  Online Laplacian-Regularized Support Vector Regression , 2017, 2017 3rd IEEE International Conference on Cybernetics (CYBCON).

[16]  R. Tibshirani,et al.  Rejoinder to "Least Angle Regression" by Efron et al. , 2004, math/0406474.

[17]  Yanchun Zhang,et al.  AdaBoost algorithm with random forests for predicting breast cancer survivability , 2008, 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence).

[18]  Ponnuthurai N. Suganthan,et al.  Ensemble Classification and Regression-Recent Developments, Applications and Future Directions [Review Article] , 2016, IEEE Computational Intelligence Magazine.

[19]  Zhihua Zhang,et al.  A non-convex relaxation approach to sparse dictionary learning , 2011, CVPR 2011.

[20]  Carlos Ordonez,et al.  Bayesian Variable Selection in Linear Regression in One Pass for Large Datasets , 2014, TKDD.

[21]  Dinggang Shen,et al.  Structured sparsity regularized multiple kernel learning for Alzheimer's disease diagnosis , 2019, Pattern Recognit..

[22]  Ethem Alpaydin,et al.  Multiple Kernel Learning Algorithms , 2011, J. Mach. Learn. Res..

[23]  Bernhard Schölkopf,et al.  Multivariate Regression via Stiefel Manifold Constraints , 2004, DAGM-Symposium.

[24]  Chao-Hua Yu,et al.  An Improved Quantum Algorithm for Ridge Regression , 2017, IEEE Transactions on Knowledge and Data Engineering.

[25]  P. Atkinson,et al.  Random Forest classification of Mediterranean land cover using multi-seasonal imagery and multi-seasonal texture , 2012 .

[26]  Gerald Schaefer,et al.  Melanoma Classification Using Dermoscopy Imaging and Ensemble Learning , 2013, 2013 2nd IAPR Asian Conference on Pattern Recognition.

[27]  Georgios B. Giannakis,et al.  Random Feature-based Online Multi-kernel Learning in Environments with Unknown Dynamics , 2017, J. Mach. Learn. Res..

[28]  Manik Varma,et al.  More generality in efficient multiple kernel learning , 2009, ICML '09.

[29]  R. Tibshirani Regression Shrinkage and Selection via the Lasso , 1996 .

[30]  Zheng-Jun Zha,et al.  Multi-Level Deep Cascade Trees for Conversion Rate Prediction in Recommendation System , 2018, AAAI.

[31]  P. N. Suganthan,et al.  Benchmarking Ensemble Classifiers with Novel Co-Trained Kernel Ridge Regression and Random Vector Functional Link Ensembles [Research Frontier] , 2017, IEEE Computational Intelligence Magazine.

[32]  Vince D. Calhoun,et al.  A multiple kernel learning approach to perform classification of groups from complex-valued fMRI data analysis: Application to schizophrenia , 2014, NeuroImage.

[33]  Lior Rokach,et al.  Ensemble learning: A survey , 2018, WIREs Data Mining Knowl. Discov..

[34]  Gunnar Rätsch,et al.  Large Scale Multiple Kernel Learning , 2006, J. Mach. Learn. Res..

[35]  Stefan Wager,et al.  High-Dimensional Asymptotics of Prediction: Ridge Regression and Classification , 2015, 1507.03003.

[36]  Xiao-Yuan Jing,et al.  Heterogeneous Defect Prediction Through Multiple Kernel Learning and Ensemble Learning , 2017, 2017 IEEE International Conference on Software Maintenance and Evolution (ICSME).

[37]  Jingjing Tang,et al.  A multi-kernel framework with nonparallel support vector machine , 2017, Neurocomputing.

[38]  Dapeng Tao,et al.  Manifold regularized kernel logistic regression for web image annotation , 2013, Neurocomputing.

[39]  Qinghua Hu,et al.  Kernel ridge regression for general noise model with its application , 2015, Neurocomputing.

[40]  Shiming Ge,et al.  Low-Resolution Face Recognition in the Wild via Selective Knowledge Distillation , 2018, IEEE Transactions on Image Processing.

[41]  Xiaoming Zhang,et al.  Multi-modal kernel ridge regression for social image classification , 2018, Appl. Soft Comput..