Extreme learning machine: Theory and applications