The Elman neural network has been widely used in many fields, from classification to prediction tasks on natural language data. However, its learning process often becomes trapped in local minima. To solve this problem and to speed up convergence, we propose an improved learning method for the Elman neural network that adds to the error function a term related to the saturation of the hidden-layer neurons. The activation functions are adapted to prevent hidden-layer neurons from entering the deep saturation region. We apply this method to Boolean Series Prediction Questions to demonstrate its validity. Simulation results show that the proposed algorithm avoids the local minima problem, greatly accelerates convergence, and achieves good results on the simulation tasks.
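The idea can be sketched as follows: an Elman network keeps a context copy of the previous hidden state, and a penalty term is added to the usual squared error so that training discourages hidden neurons from sitting in the flat (saturated) tails of the sigmoid. This is only a minimal illustration under stated assumptions; the abstract does not give the exact form of the penalty, so a common choice based on the sigmoid derivative `h*(1-h)` is assumed here, and all names (`ElmanCell`, `loss_with_saturation_penalty`, `lam`) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ElmanCell:
    """Minimal Elman recurrent cell: the hidden state is fed back
    through a context layer at the next time step."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.W_ctx = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.W_out = rng.normal(0.0, 0.1, (n_out, n_hidden))
        self.context = np.zeros(n_hidden)  # copy of previous hidden state

    def step(self, x):
        h = sigmoid(self.W_in @ x + self.W_ctx @ self.context)
        y = sigmoid(self.W_out @ h)
        self.context = h.copy()
        return h, y

def loss_with_saturation_penalty(y, target, h, lam=0.1):
    """Squared error plus an assumed anti-saturation term.

    h*(1-h) is the sigmoid derivative: it is near zero when a neuron
    is deeply saturated (h close to 0 or 1). Subtracting its sum from
    the loss rewards states where neurons stay away from saturation.
    This is an illustrative stand-in for the paper's added term, not
    its exact formulation.
    """
    mse = 0.5 * np.sum((y - target) ** 2)
    penalty = -lam * np.sum(h * (1.0 - h))
    return mse + penalty
```

With this shape of loss, gradient descent receives an extra push that keeps hidden activations in the steep region of the sigmoid, where error signals still propagate, which is the mechanism the abstract credits for escaping local minima and speeding up convergence.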