Hybrid nets with variable parameters: a novel approach to fast learning under backpropagation
This paper presents a novel approach to fast learning under standard backpropagation. We introduce hybrid neural nets in which different layers of a fully connected feed-forward network use different activation functions. The parameters of these activation functions are varied during training: in the hidden layers to accelerate learning, and in the output layer to reduce oscillation. Results on the two-spirals benchmark are presented that improve on all previously published results for backpropagation-trained feed-forward nets with monotone activation functions.
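The core idea, training activation-function parameters by gradient descent alongside the weights, can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact method: it assumes a tanh hidden activation with a trainable slope `a`, a logistic output activation with a trainable gain `b`, and uses the XOR task as a small stand-in for the two-spirals benchmark.

```python
import numpy as np

# Hybrid feed-forward net: hidden layer uses tanh(a*z) with trainable slope a,
# output layer uses sigmoid(b*z) with trainable gain b. Both activation
# parameters receive gradient updates alongside the weights. All names and
# hyperparameters here are illustrative assumptions, not from the paper.

rng = np.random.default_rng(0)

# Toy XOR data (the paper uses the two-spirals benchmark).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
a = 1.0    # hidden-layer slope parameter
b = 1.0    # output-layer gain parameter
lr = 0.5

losses = []
for step in range(2000):
    # Forward pass through the parameterised activations.
    z1 = X @ W1 + b1
    h = np.tanh(a * z1)
    z2 = h @ W2 + b2
    out = 1.0 / (1.0 + np.exp(-b * z2))

    loss = np.mean((out - y) ** 2)
    losses.append(loss)

    # Backward pass: chain rule also yields gradients for a and b.
    d_out = 2 * (out - y) / len(X)
    d_z2 = d_out * out * (1 - out) * b          # d sigmoid(b*z2) / d z2
    d_b = np.sum(d_out * out * (1 - out) * z2)  # d sigmoid(b*z2) / d b
    d_W2 = h.T @ d_z2
    d_b2 = d_z2.sum(0)
    d_h = d_z2 @ W2.T
    sech2 = 1 - np.tanh(a * z1) ** 2
    d_z1 = d_h * sech2 * a                      # d tanh(a*z1) / d z1
    d_a = np.sum(d_h * sech2 * z1)              # d tanh(a*z1) / d a
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(0)

    # Gradient-descent step on weights and activation parameters alike.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2
    a -= lr * d_a
    b -= lr * d_b

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Letting `a` and `b` move frees the network from a fixed activation shape: a larger hidden slope sharpens decision boundaries early in training, while adapting the output gain damps oscillation of the output error, which is the intuition the abstract attributes to the two parameter schedules.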