David Weiss | Lingpeng Kong | Chris Alberti | Daniel Andor | Ivan Bogatyy
[1] Quoc V. Le, et al. Multi-task Sequence to Sequence Learning, 2015, ICLR.
[2] Slav Petrov, et al. Globally Normalized Transition-Based Neural Networks, 2016, ACL.
[3] Jakob Uszkoreit, et al. A Decomposable Attention Model for Natural Language Inference, 2016, EMNLP.
[4] Yasemin Altun, et al. Overcoming the Lack of Parallel Data in Sentence Compression, 2013, EMNLP.
[5] Guillaume Lample, et al. Neural Architectures for Named Entity Recognition, 2016, NAACL.
[6] Christopher D. Manning, et al. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks, 2010.
[7] Jeffrey Pennington, et al. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection, 2011, NIPS.
[8] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[9] Yue Zhang, et al. Transition-Based Neural Word Segmentation, 2016, ACL.
[10] Yuan Zhang, et al. Stack-propagation: Improved Representation Learning for Syntax, 2016, ACL.
[11] Eliyahu Kiperwasser, et al. Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations, 2016, TACL.
[12] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[13] Joakim Nivre, et al. Inductive Dependency Parsing, 2006, Text, Speech and Language Technology.
[14] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[15] Wei Xu, et al. Bidirectional LSTM-CRF Models for Sequence Tagging, 2015, ArXiv.
[16] Geoffrey E. Hinton, et al. Grammar as a Foreign Language, 2014, NIPS.
[17] Noah A. Smith, et al. Distilling an Ensemble of Greedy Dependency Parsers into One MST Parser, 2016, EMNLP.
[18] Christopher Potts, et al. A Fast Unified Model for Parsing and Sentence Understanding, 2016, ACL.
[19] Noah A. Smith, et al. Transition-Based Dependency Parsing with Stack Long Short-Term Memory, 2015, ACL.
[20] Felice Dell'Orletta, et al. Reverse Revision and Linear Tree Combination for Dependency Parsing, 2009, HLT-NAACL.
[21] Phil Blunsom, et al. Recurrent Continuous Translation Models, 2013, EMNLP.
[22] Dianhai Yu, et al. Multi-Task Learning for Multiple Language Translation, 2015, ACL.
[23] Christoph Goller, et al. Learning Task-Dependent Distributed Representations by Backpropagation Through Structure, 1996, Proceedings of the International Conference on Neural Networks (ICNN'96).
[24] Wang Ling, et al. Finding Function in Form: Compositional Character Models for Open Vocabulary Word Representation, 2015, EMNLP.
[25] George Kurian, et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016, ArXiv.
[26] Eduard H. Hovy, et al. When Are Tree Structures Necessary for Deep Learning of Representations?, 2015, EMNLP.
[27] Christopher D. Manning, et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[28] Slav Petrov, et al. Structured Training for Neural Network Transition-Based Parsing, 2015, ACL.