Kernel Risk-Sensitive Loss based Hyper-graph Regularized Robust Extreme Learning Machine and Its Semi-supervised Extension for Classification

Abstract Kernel Risk-Sensitive Loss (KRSL) is a nonlinear similarity measure defined in kernel space; it enables gradient-based methods to achieve higher accuracy while effectively weakening the negative effects of noise and outliers. Defined as the expectation of a kernel function between two random variables, KRSL has been successfully applied in robust machine learning and signal processing. The Extreme Learning Machine (ELM), one of the most popular machine learning methods, has attracted great attention in both supervised and semi-supervised learning. However, the performance of traditional ELM methods declines when the data contain noise and outliers, when the manifold structure of the data is ignored, or when the network structure is overly complex. Therefore, based on KRSL, hyper-graph regularization, and the L2,1-norm, we first propose a more robust ELM method named Kernel Risk-Sensitive Loss Based Hyper-graph Regularized Robust Extreme Learning Machine (KRSL-HRELM). In KRSL-HRELM, KRSL is introduced into ELM to enhance its ability to handle noise and outliers. Moreover, hyper-graph regularization is integrated into the method to learn higher-order geometric structure information among the data. In addition, the L2,1-norm is imposed on the output weight matrix to obtain a sparse network model. Inspired by other semi-supervised ELM methods, we extend KRSL-HRELM to semi-supervised learning and propose its semi-supervised version, SS-KRSL-HRELM. Empirical studies on a large number of real-world datasets show that the proposed methods are competitive with other advanced supervised and semi-supervised learning methods in terms of robustness and efficiency.
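The abstract describes KRSL as a kernel-function expectation between two random variables. As a rough illustration only (not the paper's exact objective, which additionally includes hyper-graph regularization and an L2,1-norm penalty on the output weights), the sketch below computes the commonly used empirical KRSL of the prediction residuals of a basic ELM with a Gaussian kernel. The function names, hyperparameter values, and toy data are assumptions introduced for this example.

```python
import numpy as np

def gaussian_kernel(e, sigma=1.0):
    # Gaussian kernel on the error e: kappa_sigma(e) = exp(-e^2 / (2 * sigma^2))
    return np.exp(-(e ** 2) / (2.0 * sigma ** 2))

def krsl_loss(errors, sigma=1.0, lam=2.0):
    # Empirical kernel risk-sensitive loss (assumed standard form):
    #   (1 / (N * lam)) * sum_i exp(lam * (1 - kappa_sigma(e_i)))
    # lam > 0 is the risk-sensitive parameter; the loss is bounded in each error,
    # so large residuals from outliers contribute far less than under squared loss.
    return np.mean(np.exp(lam * (1.0 - gaussian_kernel(errors, sigma)))) / lam

def elm_hidden(X, W, b):
    # Random-feature hidden layer of a basic ELM: sigmoid(X W + b), with W and b fixed.
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Toy usage: random hidden mapping, ridge-style output weights, KRSL of the residuals.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # inputs
y = rng.normal(size=(100, 1))          # targets
W = rng.normal(size=(5, 50))           # random input weights (not trained in ELM)
b = rng.normal(size=(1, 50))           # random biases
H = elm_hidden(X, W, b)
beta = np.linalg.solve(H.T @ H + 1e-2 * np.eye(50), H.T @ y)   # regularized least squares
print("KRSL of residuals:", krsl_loss((y - H @ beta).ravel()))
```

In the proposed methods this squared-loss solve would be replaced by minimizing the KRSL-based objective with the hyper-graph Laplacian and L2,1-norm terms; the snippet only shows how the loss itself behaves on ELM residuals.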
