This paper presents a comparative study of the extreme learning machine (ELM) and the support vector machine (SVM), and also proposes a cross-validation method for determining the appropriate number of neurons in the hidden layer. ELM, proposed by Huang et al. [4], is a machine-learning algorithm for single hidden-layer feedforward neural networks (SLFNs): it chooses the input weights and hidden-layer biases randomly and determines the output weights analytically, instead of tuning them iteratively. The algorithm tends to achieve good generalization ability while attaining small empirical risk, and rests on solid theoretical foundations. Benchmark tests on the Tennessee Eastman Process (TEP) are carried out to validate these claims: in the case studied in this paper, ELM trains much faster than SVM and has better generalization performance.
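The two ideas summarized above can be sketched in code. The following is a minimal illustration, not the authors' implementation: `elm_train` fixes random input weights and hidden biases and solves for the output weights analytically via the Moore-Penrose pseudoinverse, and `select_n_hidden` is a plain k-fold cross-validation loop over candidate hidden-layer sizes (the tanh activation, the uniform [-1, 1] initialization, and all function names are assumptions for the sketch).

```python
import numpy as np

def elm_train(X, y, n_hidden, rng=None):
    """ELM training: random hidden layer, analytic output weights."""
    rng = np.random.default_rng(rng)
    # Input weights and hidden biases are chosen randomly and never tuned
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y      # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def select_n_hidden(X, y, candidates, k=5, seed=0):
    """Pick the hidden-layer size with the lowest k-fold cross-validation MSE."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), k)
    best, best_mse = None, np.inf
    for n in candidates:
        mse = 0.0
        for i in range(k):
            test_idx = folds[i]
            train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
            W, b, beta = elm_train(X[train_idx], y[train_idx], n, rng=seed)
            pred = elm_predict(X[test_idx], W, b, beta)
            mse += np.mean((pred - y[test_idx]) ** 2)
        if mse < best_mse:
            best, best_mse = n, mse
    return best

# Toy usage on a synthetic regression task (not TEP data)
X = np.linspace(0.0, np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_train(X, y, n_hidden=20, rng=0)
y_hat = elm_predict(X, W, b, beta)
best_n = select_n_hidden(X, y, candidates=[5, 10, 20])
```

Because training reduces to one random draw and one least-squares solve, there is no iterative weight tuning, which is the source of the speed advantage over SVM reported in the abstract.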
[1] V. Vapnik et al., Statistical Learning Theory, 1998.
[2] O. Chapelle et al., "Model Selection for Support Vector Machines," NIPS, 1999.
[3] G.-B. Huang et al., "Learning capability and storage capacity of two-hidden-layer feedforward networks," IEEE Trans. Neural Networks, 2003.
[4] G.-B. Huang et al., "Extreme learning machine: a new learning scheme of feedforward neural networks," 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541), 2004.