Kyoto University Participation to WAT 2017
Sadao Kurohashi | Raj Dabre | Toshiaki Nakazawa | Fabien Cromierès
[1] Yang Liu, et al. Modeling Coverage for Neural Machine Translation, 2016, ACL.
[2] Karin M. Verspoor, et al. Findings of the 2016 Conference on Machine Translation, 2016, WMT.
[3] Graham Neubig, et al. Overview of the 3rd Workshop on Asian Translation, 2015, WAT@COLING.
[4] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[5] Isao Goto, et al. Detecting Untranslated Content for Neural Machine Translation, 2017, NMT@ACL.
[6] Chenhui Chu, et al. KyotoEBMT System Description for the 2nd Workshop on Asian Translation, 2015, WAT.
[7] Quoc V. Le, et al. Addressing the Rare Word Problem in Neural Machine Translation, 2014, ACL.
[8] Fabien Cromierès, et al. Kyoto-NMT: a Neural Machine Translation implementation in Chainer, 2016, COLING.
[9] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method, 2012, ArXiv.
[10] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[11] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[12] George Kurian, et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016, ArXiv.
[13] Alon Lavie, et al. Combining Machine Translation Output with Open Source: The Carnegie Mellon Multi-Engine Machine Translation Scheme, 2010, Prague Bull. Math. Linguistics.
[14] Graham Neubig, et al. Overview of the 5th Workshop on Asian Translation, 2019, PACLIC.
[15] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.
[16] Jiajun Zhang, et al. Towards Zero Unknown Word in Neural Machine Translation, 2016, IJCAI.
[17] Chenhui Chu, et al. Chinese-Japanese Machine Translation Exploiting Chinese Characters, 2013, ACM Trans. Asian Lang. Inf. Process.
[18] Kenta Oono, et al. Chainer: a Next-Generation Open Source Framework for Deep Learning, 2015.
[19] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[20] Rico Sennrich, et al. The AMU-UEDIN Submission to the WMT16 News Translation Task: Attention-based NMT Models as Feature Functions in Phrase-based SMT, 2016, WMT.
[21] Wojciech Zaremba, et al. Recurrent Neural Network Regularization, 2014, ArXiv.
[22] Toshiaki Nakazawa, et al. ASPEC: Asian Scientific Paper Excerpt Corpus, 2016, LREC.
[23] Kenneth Heafield, et al. KenLM: Faster and Smaller Language Model Queries, 2011, WMT@EMNLP.
[24] Yoshua Bengio, et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling, 2014, ArXiv.
[25] Chenhui Chu, et al. Kyoto University Participation to WAT 2016, 2016, COLING 2016.
[26] Rico Sennrich, et al. Edinburgh Neural Machine Translation Systems for WMT 16, 2016, WMT.
[27] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[28] Chenhui Chu, et al. Consistent Word Segmentation, Part-of-Speech Tagging and Dependency Labelling Annotation for Chinese Language, 2016, COLING.
[29] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.