Forward-backward retraining of recurrent neural networks
[1] Alberto Del Bimbo, et al. Recurrent neural networks can be trained to be maximum a posteriori probability classifiers, 1995, Neural Networks.
[2] James L. McClelland, et al. Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations, 1986.
[3] Robert A. Jacobs, et al. Increased rates of convergence through learning rate adaptation, 1987, Neural Networks.
[4] L. Rabiner, et al. An introduction to hidden Markov models, 1986, IEEE ASSP Magazine.
[5] Andrew W. Senior, et al. Off-line Cursive Handwriting Recognition using Recurrent Neural Networks, 1994.
[6] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.
[7] Paul J. Werbos, et al. Backpropagation Through Time: What It Does and How to Do It, 1990, Proc. IEEE.
[8] Hervé Bourlard, et al. Connectionist Speech Recognition: A Hybrid Approach, 1993.
[9] Anthony J. Robinson, et al. An application of recurrent nets to phone probability estimation, 1994, IEEE Trans. Neural Networks.