The backpropagation unfold recurrent rule

A novel learning algorithm for recurrent backpropagation (BP) networks is introduced. Existing learning algorithms for recurrent models are mostly based on the static generalized delta rule of the BP algorithm, which does not give satisfactory results for a recurrent network. The authors derive the BP unfold recurrent rule (BURR) by unfolding the structure of the recurrent network. Using this learning rule, the connection weights are updated by accumulating the effects of the weight changes over each time interval. Experimental results are presented to demonstrate the memorizing ability of a recurrent network trained with this learning algorithm; sequences of input characters were successfully recognized.
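The sketch below illustrates the unfolding idea in the abstract: a simple recurrent network is unrolled over the length of an input sequence, and the weight gradients computed at every time step are accumulated into a single update. The layer sizes, the tanh activation, the squared-error loss, and all names are illustrative assumptions, not the paper's exact formulation of BURR.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 8, 3                          # illustrative layer sizes
W_in  = rng.normal(scale=0.1, size=(n_hid, n_in))     # input  -> hidden weights
W_rec = rng.normal(scale=0.1, size=(n_hid, n_hid))    # hidden -> hidden (recurrent) weights
W_out = rng.normal(scale=0.1, size=(n_out, n_hid))    # hidden -> output weights
lr = 0.1

def train_step(xs, ys):
    """One update on a sequence: forward pass through the unfolded network,
    then a backward pass that accumulates per-time-step weight gradients."""
    global W_in, W_rec, W_out
    T = len(xs)
    hs = [np.zeros(n_hid)]            # hs[0] is the initial hidden state
    outs = []
    # Forward pass over the unfolded network.
    for t in range(T):
        h = np.tanh(W_in @ xs[t] + W_rec @ hs[-1])
        hs.append(h)
        outs.append(W_out @ h)
    # Backward pass: accumulate the effect of each time interval on the weights.
    dW_in  = np.zeros_like(W_in)
    dW_rec = np.zeros_like(W_rec)
    dW_out = np.zeros_like(W_out)
    dh_next = np.zeros(n_hid)         # gradient flowing back from step t+1
    for t in reversed(range(T)):
        d_out = outs[t] - ys[t]                      # squared-error gradient at step t
        dW_out += np.outer(d_out, hs[t + 1])
        dh = W_out.T @ d_out + dh_next
        dz = dh * (1.0 - hs[t + 1] ** 2)             # derivative of tanh
        dW_in  += np.outer(dz, xs[t])
        dW_rec += np.outer(dz, hs[t])
        dh_next = W_rec.T @ dz
    # Single weight update built from the accumulated per-step contributions.
    W_in  -= lr * dW_in
    W_rec -= lr * dW_rec
    W_out -= lr * dW_out

# Toy usage: one update on a short random sequence with random targets.
xs = [rng.normal(size=n_in) for _ in range(5)]
ys = [rng.normal(size=n_out) for _ in range(5)]
train_step(xs, ys)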