[1] Sepp Hochreiter and Jürgen Schmidhuber. Long Short-Term Memory. Neural Computation, 1997.
[2] R. Harald Baayen, Douglas J. Davidson, and Douglas M. Bates. Mixed-effects modeling with crossed random effects for subjects and items. Journal of Memory and Language, 2008.
[3] Rafal Jozefowicz, Oriol Vinyals, Mike Schuster, Noam Shazeer, and Yonghui Wu. Exploring the Limits of Language Modeling. arXiv, 2016.
[4] Tal Linzen, Emmanuel Dupoux, and Yoav Goldberg. Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies. TACL, 2016.
[5] Jeffrey L. Elman. Finding Structure in Time. Cognitive Science, 1990.
[6] Ethan Wilcox, Roger Levy, Takashi Morita, and Richard Futrell. What do RNN Language Models Learn about Filler–Gap Dependencies? BlackboxNLP@EMNLP, 2018.
[7] Roger Levy. Expectation-based syntactic comprehension. Cognition, 2008.
[8] Rebecca Marvin and Tal Linzen. Targeted Syntactic Evaluation of Language Models. EMNLP, 2018.
[9] Masaya Yoshida et al. On the origin of islands. 2014.
[10] John Robert Ross. Constraints on variables in syntax. Doctoral dissertation, MIT, 1967.
[11] C. Pollard et al. Center for the Study of Language and Information, 2022.
[12] R. Thomas McCoy, Robert Frank, and Tal Linzen. Revisiting the poverty of the stimulus: hierarchical generalization without a hierarchical bias in recurrent neural networks. CogSci, 2018.
[13] John Hale. A Probabilistic Earley Parser as a Psycholinguistic Model. NAACL, 2001.
[14] Nathaniel J. Smith and Roger Levy. The effect of word predictability on reading time is logarithmic. Cognition, 2013.
[15] Shammur Absar Chowdhury and Roberto Zamparelli. RNN Simulations of Grammaticality Judgments on Long-distance Dependencies. COLING, 2018.
[16] Kristina Gulordava, Piotr Bojanowski, Edouard Grave, Tal Linzen, and Marco Baroni. Colorless Green Recurrent Networks Dream Hierarchically. NAACL, 2018.
[17] Jeffrey L. Elman. Distributed representations, simple recurrent networks, and grammatical structure. Machine Learning, 1991.
[18] Dale J. Barr, Roger Levy, Christoph Scheepers, and Harry J. Tily. Random effects structure for confirmatory hypothesis testing: Keep it maximal. Journal of Memory and Language, 2013.
[19] Michael E. J. Masson and Geoffrey R. Loftus. Using confidence intervals for graphically based data interpretation. Canadian Journal of Experimental Psychology, 2003.