Finding Hierarchical Structure in Neural Stacks Using Unsupervised Parsing
William Merrill | Lenny Khazan | Noah Amsel | Yiding Hao | Simon Mendelsohn | Robert Frank