Multilayer Incremental Hybrid Cost-Sensitive Extreme Learning Machine With Multiple Hidden Output Matrix and Subnetwork Hidden Nodes

Recently, multilayer extreme learning machines (ML-ELMs) with subnetwork nodes and cost-sensitive learning have been applied to representation learning and classification. However, existing ML-ELM methods suffer from several drawbacks: 1) manually tuning the number of hidden nodes in each layer slows model training, and the best approach for determining the number of hidden nodes remains an open problem; 2) the random projection of the input weights and biases in each layer is still suboptimal, so the network structure can hardly reach its best generalization performance. Inspired by subnetwork nodes and cost-sensitive methods, and using multiple hidden output matrices, this paper proposes a novel ELM method called the Multilayer Incremental Hybrid Cost-Sensitive Extreme Learning Machine with Multiple Hidden Output Matrix and Subnetwork Hidden Nodes. Its contributions are: 1) a hybrid hidden-node optimization method based on the ant clone method and a multiple grey wolf optimization method; 2) multiple hidden-layer output matrices combined through a weighted calculation of the different output matrices, which optimizes the network architecture effectively; 3) a novel ELM structure that merges the merits of the cost-sensitive extreme learning machine and subnetwork hidden nodes, in which the generalization performance is not sensitive to the regularization parameter $C$. Extensive experimental results show that the proposed method achieves better performance than existing state-of-the-art ELM methods.
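To make the underlying mechanism concrete, the following is a minimal sketch of a basic single-hidden-layer ELM with a ridge-regularized output-weight solution, the building block the abstract extends. The sigmoid activation, the uniform weighting of the combined hidden output matrices, and all function names are illustrative assumptions; the paper's exact incremental, cost-sensitive, and hybrid-optimization formulation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden_output(X, W, b):
    """Random-feature hidden layer of a basic ELM: H = sigmoid(X W + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def elm_fit(X, T, n_hidden=50, C=1.0):
    """Fit output weights beta = (H^T H + I/C)^{-1} H^T T.

    W and b are drawn randomly and never trained, as in a standard ELM;
    C is the regularization parameter mentioned in the abstract.
    """
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = hidden_output(X, W, b)
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return hidden_output(X, W, b) @ beta

def combine_hidden_outputs(H_list, weights=None):
    """Weighted combination of multiple hidden output matrices.

    Illustrative only: the abstract says different output matrices are
    combined by a weighted calculation, but does not give the weighting
    scheme; uniform weights here are an assumption.
    """
    if weights is None:
        weights = np.full(len(H_list), 1.0 / len(H_list))
    return sum(w * H for w, H in zip(weights, H_list))

# Usage sketch on a toy regression problem:
X = rng.standard_normal((200, 3))
T = X @ np.array([[1.0], [2.0], [-1.0]])
W, b, beta = elm_fit(X, T, n_hidden=100, C=100.0)
pred = elm_predict(X, W, b, beta)
```

The closed-form solve is what gives ELMs their training speed relative to backpropagation: only the output weights are learned, and only via one regularized least-squares problem.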
