Deep Inside-outside Recursive Autoencoder with All-span Objective
Ruyue Hong | Jiong Cai | Kewei Tu