Graph-based broad learning system for classification

Abstract The broad learning system (BLS) is viewed as an alternative to deep neural networks (DNNs). Compared with a DNN, BLS can reach similar performance with a much faster learning speed. BLS contains two kinds of nodes: mapped features and enhancement nodes. The mapped features are obtained with randomly generated connecting weights and biases, which are then fine-tuned by a sparse autoencoder. However, the sparse autoencoder, as an unsupervised learning algorithm, discards the information carried by the ground-truth labels. The enhancement nodes are formed from the mapped features, again with randomly generated weights and biases, and this random process cannot guarantee the quality of the resulting nodes. As a consequence, BLS may fail to learn sufficiently useful representations of the original data. Motivated by the effectiveness of the extreme learning machine based autoencoder (ELM-AE), we first propose a novel graph-based ELM-AE (GBEAE). We then introduce a GBEAE-based broad learning system (GBEAE-BLS) to overcome the disadvantages discussed above. Finally, we evaluate GBEAE-BLS on 23 datasets. Experimental results demonstrate that GBEAE-BLS has evident advantages over related learning algorithms on these datasets.
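
To make the architecture described above concrete, here is a minimal NumPy sketch of a plain BLS classifier: groups of randomly mapped features, enhancement nodes built from them with further random weights, and an output layer solved in closed form by ridge regression. The function and parameter names are illustrative only; the sparse-autoencoder fine-tuning of the mapped features, and the GBEAE refinement proposed in the paper, are deliberately omitted from this sketch.

```python
import numpy as np

# Minimal BLS sketch with hypothetical helper names.
# Sparse-autoencoder fine-tuning and the GBEAE variant are omitted.

def relu(x):
    return np.maximum(x, 0.0)

def bls_fit_predict(X, Y, n_groups=10, nodes_per_group=20, n_enhance=100,
                    reg=1e-3, seed=0):
    """Random mapped-feature groups, random enhancement nodes, and a
    ridge-regression output layer solved in closed form."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]

    # Mapped feature groups: Z_i = phi(X W_i + b_i), with random W_i, b_i.
    groups = []
    for _ in range(n_groups):
        W = rng.standard_normal((d, nodes_per_group))
        b = rng.standard_normal(nodes_per_group)
        groups.append(relu(X @ W + b))
    Z = np.hstack(groups)

    # Enhancement nodes: H = xi(Z W_h + b_h), with random W_h, b_h.
    W_h = rng.standard_normal((Z.shape[1], n_enhance))
    b_h = rng.standard_normal(n_enhance)
    H = np.tanh(Z @ W_h + b_h)

    # Output weights by ridge regression on A = [Z | H]:
    # W_out = (A^T A + reg * I)^(-1) A^T Y.
    A = np.hstack([Z, H])
    W_out = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
    return A @ W_out

# Usage example: Y is a one-hot label matrix; predictions are taken by argmax.
# X = np.random.rand(200, 30); Y = np.eye(3)[np.random.randint(0, 3, 200)]
# scores = bls_fit_predict(X, Y); labels = scores.argmax(axis=1)
```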
