Continuous Time Recurrent Neural Networks for Grammatical Induction
[1] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.
[2] Jeffrey L. Elman, et al. Finding Structure in Time, 1990, Cogn. Sci.
[3] Lars Niklasson, et al. Can Connectionist Models Exhibit Non-Classical Structure Sensitivity?, 1994, Proceedings of the Sixteenth Annual Conference of the Cognitive Science Society.
[4] L. Ingber. Very fast simulated re-annealing, 1989.
[5] James L. McClelland, et al. Learning and Applying Contextual Constraints in Sentence Comprehension, 1990, Artif. Intell.
[6] Emanuel Marom, et al. Efficient Training of Recurrent Neural Network with Time Delays, 1997, Neural Networks.
[7] Barak A. Pearlmutter. Gradient calculations for dynamic recurrent neural networks: a survey, 1995, IEEE Trans. Neural Networks.
[8] Jordan B. Pollack, et al. Recursive Distributed Representations, 1990, Artif. Intell.
[9] David J. Chalmers, et al. Syntactic Transformations on Distributed Representations, 1990.
[10] Barry L. Kalman, et al. TRAINREC: A System for Training Feedforward & Simple Recurrent Networks Efficiently and Correctly, 1993.
[11] Stefan Wermter. Hybrid Connectionist Natural Language Processing, 1994.