Edinburgh Neural Machine Translation Systems for WMT 16
[1] Philip Gage, et al. A new algorithm for data compression, 1994.
[2] Andrew McCallum, et al. Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data, 2001, ICML.
[3] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method, 2012, ArXiv.
[4] Razvan Pascanu, et al. On the difficulty of training recurrent neural networks, 2012, ICML.
[5] Christopher Kermorvant, et al. Dropout Improves Recurrent Neural Networks for Handwriting Recognition, 2013, 2014 14th International Conference on Frontiers in Handwriting Recognition.
[6] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[7] Zoubin Ghahramani, et al. A Theoretically Grounded Application of Dropout in Recurrent Neural Networks, 2015, NIPS.
[8] Ondrej Dusek, et al. CzEng 1.6: Enlarged Czech-English Parallel Corpus with Processing Tools Dockered, 2016, TSD.
[9] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.
[10] Lemao Liu, et al. Agreement on Target-bidirectional Neural Machine Translation, 2016, NAACL.
[11] Rico Sennrich, et al. Improving Neural Machine Translation Models with Monolingual Data, 2015, ACL.
[12] Rico Sennrich, et al. The AMU-UEDIN Submission to the WMT16 News Translation Task: Attention-based NMT Models as Feature Functions in Phrase-based SMT, 2016, WMT.
[13] Fabienne Braune, et al. The QT21/HimL Combined Machine Translation System, 2016, WMT.