Time-constrained optimal method to find the minimum architectures for feedforward neural networks
Huang et al. (1996, 2002) proposed an architecture selection algorithm called SEDNN to find the minimum architectures for feedforward neural networks, based on the golden section search method and the upper bounds on the number of hidden neurons. As stated in Huang (2002) and Huang et al. (1998), these bounds are 2√((m + 2)N) for a two-layered feedforward network (TLFN) and N for a single-layer feedforward network (SLFN), where N is the number of training samples and m is the number of output neurons. The SEDNN algorithm works well under the assumption that the time allowed for executing the algorithm is unlimited. This paper proposes an algorithm similar to SEDNN, but with an added time factor to cater for applications that require results within a specified period of time.
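The following Python sketch illustrates the general idea of a golden-section-style search over the number of hidden neurons with a wall-clock time budget. It is not the authors' SEDNN implementation: it assumes that error decreases monotonically with the hidden-neuron count, and the `train_and_evaluate` helper, the error threshold, and the use of the TLFN bound 2√((m + 2)N) as the search ceiling are illustrative assumptions.

```python
import math
import time

def time_constrained_min_hidden(train_and_evaluate, N, m, time_budget_s,
                                error_threshold=0.05):
    """Search [1, 2*sqrt((m+2)*N)] for a small hidden-neuron count whose
    error falls below error_threshold, stopping when the time budget runs out.
    train_and_evaluate(n) is assumed to train a network with n hidden neurons
    and return its error (an assumption for illustration)."""
    phi = (math.sqrt(5) - 1) / 2                    # golden ratio conjugate, ~0.618
    lo, hi = 1, math.ceil(2 * math.sqrt((m + 2) * N))  # TLFN upper bound as ceiling
    best = hi                                        # the upper bound suffices in theory
    deadline = time.monotonic() + time_budget_s

    while hi - lo > 1 and time.monotonic() < deadline:
        # Place the probe point by the golden ratio, clamped to an interior integer.
        probe = max(lo + 1, min(hi - 1, round(hi - phi * (hi - lo))))
        err = train_and_evaluate(probe)
        if err <= error_threshold:
            best, hi = probe, probe                  # feasible: shrink from above
        else:
            lo = probe                               # too small: shrink from below
    return best
```

The time check before each probe means the search returns the best architecture found so far rather than running to completion, which is the distinction from the original, time-unbounded SEDNN search described above.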
[1] Guang-Bin Huang, et al. Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions, 1998, IEEE Trans. Neural Networks.
[2] Guang-Bin Huang, et al. Learning capability and storage capacity of two-hidden-layer feedforward networks, 2003, IEEE Trans. Neural Networks.