Application of fully recurrent neural networks for speech recognition

The authors describe an extended backpropagation algorithm for fully connected recurrent neural networks applied to speech recognition. The extended delta rule is approximated by excluding some past activities of the dynamic neurons, reducing computational complexity without degrading performance. In speaker-dependent recognition of a confusable syllable set, the fully recurrent neural network trained with the approximated backpropagation algorithm outperformed both the multilayer perceptron and the self-recurrent network at comparable time complexity. In addition, it is found that most self-recurrent connections become excitatory, while most mutual recurrent connections become inhibitory.
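
The abstract does not give the update equations, but the extended delta rule for a fully connected recurrent network is the real-time recurrent learning (RTRL) recursion, and "excluding past activities of the dynamic neurons" plausibly corresponds to truncating the recursive sensitivity term of that recursion. The sketch below illustrates this reading; the sigmoid nonlinearity, the shapes, the `truncated` flag, and the toy training loop are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def rtrl_step(W, y_prev, x, p_prev, truncated=True):
    """One time step of a fully recurrent layer with (approximate) RTRL.

    W      : (n, n + m) weights; columns 0..n-1 are recurrent, the rest input.
    y_prev : (n,) previous activations of the dynamic neurons.
    x      : (m,) current input frame (e.g. speech features).
    p_prev : (n, n, n + m) sensitivities dy_k/dw_ij from the previous step.
    """
    n = y_prev.shape[0]
    z = np.concatenate([y_prev, x])   # joint recurrent + input vector
    s = W @ z                         # net input to each dynamic neuron
    y = sigmoid(s)
    fprime = y * (1.0 - y)            # sigmoid derivative at s

    # Immediate term of the extended delta rule:
    # dy_k/dw_ij = f'(s_k) * delta_{ki} * z_j
    p = np.zeros_like(p_prev)
    for i in range(n):
        p[i, i, :] = fprime[i] * z

    if not truncated:
        # Full recursion: propagate old sensitivities through the
        # recurrent weights -- the expensive O(n^4)-per-step part
        # that the paper's approximation cuts down.
        p += fprime[:, None, None] * np.einsum(
            "kl,lij->kij", W[:, :n], p_prev)

    return y, p

# Toy online training loop over one sequence (hypothetical data/targets).
rng = np.random.default_rng(0)
n, m, T, lr = 4, 3, 10, 0.1
W = rng.normal(scale=0.1, size=(n, n + m))
y, p = np.zeros(n), np.zeros((n, n, n + m))
for t in range(T):
    x = rng.normal(size=m)
    y, p = rtrl_step(W, y, x, p, truncated=True)
    e = y - np.zeros(n)                      # error against dummy targets
    W -= lr * np.einsum("k,kij->ij", e, p)   # gradient descent step
```

With `truncated=True` the per-step cost of maintaining the sensitivities drops from O(n^4) to O(n^2), which matches the abstract's claim of comparable time complexity to simpler networks; exactly which past activities the authors exclude is specified in the paper itself, not recoverable from the abstract.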