Akiko Eriguchi | Yoshimasa Tsuruoka | Kyunghyun Cho
[1] Razvan Pascanu, et al. Understanding the exploding gradient problem, 2012, ArXiv.
[2] Yoav Goldberg, et al. Towards String-To-Tree Neural Machine Translation, 2017, ACL.
[3] Razvan Pascanu, et al. On the difficulty of training recurrent neural networks, 2013, ICML.
[4] Yoshimasa Tsuruoka, et al. Neural Machine Translation with Source-Side Latent Graph Parsing, 2017, EMNLP.
[5] Noah A. Smith, et al. What Do Recurrent Neural Network Grammars Learn About Syntax?, 2017, EACL.
[6] Kevin Duh, et al. Automatic Evaluation of Translation Quality for Distant Language Pairs, 2010, EMNLP.
[7] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2016, ACL.
[8] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[9] Yoshimasa Tsuruoka, et al. Tree-to-Sequence Attentional Neural Machine Translation, 2016, ACL.
[10] Xing Shi, et al. Does String-Based Neural MT Learn Source Syntax?, 2016, EMNLP.
[11] Bill Byrne, et al. Syntactically Guided Neural Machine Translation, 2016, ACL.
[12] Jason Weston, et al. Natural Language Processing (Almost) from Scratch, 2011, Journal of Machine Learning Research.
[13] Noah A. Smith, et al. Recurrent Neural Network Grammars, 2016, NAACL.
[14] Yoshua Bengio, et al. Describing Multimedia Content Using Attention-Based Encoder-Decoder Networks, 2015, IEEE Transactions on Multimedia.
[15] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[16] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[17] Benoît Sagot, et al. From Noisy Questions to Minecraft Texts: Annotation Challenges in Extreme Syntax Scenario, 2016, NUT@COLING.
[18] Rich Caruana. Multitask Learning, 1997, Machine Learning.
[19] Daniel Jurafsky, et al. Distant supervision for relation extraction without labeled data, 2009, ACL.
[20] Jason Lee, et al. Fully Character-Level Neural Machine Translation without Explicit Segmentation, 2017, TACL.
[21] Salim Roukos, et al. Bleu: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.
[22] Rico Sennrich, et al. Linguistic Input Features Improve Neural Machine Translation, 2016, WMT.
[23] Noah A. Smith, et al. Transition-Based Dependency Parsing with Stack Long Short-Term Memory, 2015, ACL.
[24] Lukáš Burget, et al. Recurrent neural network based language model, 2010, INTERSPEECH.
[25] Tommi S. Jaakkola, et al. Tree-structured decoding with doubly-recurrent neural networks, 2017, ICLR.
[26] Pradeep Dubey, et al. BlackOut: Speeding up Recurrent Neural Network Language Models With Very Large Vocabularies, 2016, ICLR.
[27] George Kurian, et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016, ArXiv.
[28] Graham Neubig, et al. Pointwise Prediction for Robust, Adaptable Japanese Morphological Analysis, 2011, ACL.
[29] Yoshua Bengio, et al. A Character-level Decoder without Explicit Segmentation for Neural Machine Translation, 2016, ACL.
[30] Geoffrey E. Hinton, et al. Distilling the Knowledge in a Neural Network, 2015, ArXiv.
[31] Philipp Koehn, et al. Statistical Significance Tests for Machine Translation Evaluation, 2004, EMNLP.
[32] Slav Petrov, et al. Globally Normalized Transition-Based Neural Networks, 2016, ACL.
[33] Philipp Koehn, et al. Moses: Open Source Toolkit for Statistical Machine Translation, 2007, ACL.
[34] Wojciech Zaremba, et al. An Empirical Exploration of Recurrent Network Architectures, 2015, ICML.
[35] Ashish Vaswani, et al. Simple, Fast Noise-Contrastive Estimation for Large RNN Vocabularies, 2016, HLT-NAACL.