[1] Salim Roukos, et al. BLEU: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.
[2] Etienne Barnard, et al. Statistical Translation with Scarce Resources: A South African Case Study, 2006.
[3] D. van Niekerk. Exploring unsupervised word segmentation for machine translation in the South African context, 2014.
[4] Edouard Grave, et al. Reducing Transformer Depth on Demand with Structured Dropout, 2019, ICLR.
[5] Steve Kroon, et al. Critical initialisation for deep signal propagation in noisy rectifier neural networks, 2018, NeurIPS.
[6] Myle Ott, et al. Facebook FAIR's WMT19 News Translation Task Submission, 2019, WMT.
[7] Laura Martinus, et al. Towards Neural Machine Translation for African Languages, 2018, arXiv.
[8] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[9] Hendrik J. Groenewald, et al. Corpora for Three South African Language Pairs in the Autshumato Project, 2010.
[10] Walter Scheirer, et al. Auto-Sizing the Transformer Network: Improving Speed, Efficiency, and Performance for Low-Resource Machine Translation, 2019, EMNLP.
[11] Ondrej Bojar, et al. Training Tips for the Transformer Model, 2018, Prague Bull. Math. Linguistics.
[12] Julian Salazar, et al. Transformers without Tears: Improving the Normalization of Self-Attention, 2019, arXiv.
[13] Garrison W. Cottrell, et al. ReZero is All You Need: Fast Convergence at Large Depth, 2020, UAI.
[14] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[15] Laura Martinus, et al. A Focus on Neural Machine Translation for African Languages, 2019, arXiv.