RETURNN as a Generic Flexible Neural Toolkit with Application to Translation and Speech Recognition
[1] Hermann Ney,et al. CTC in the Context of Generalized Full-Sum HMM Training , 2017, INTERSPEECH.
[2] Samy Bengio,et al. Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks , 2015, NIPS.
[3] Hermann Ney,et al. Returnn: The RWTH extensible training framework for universal recurrent neural networks , 2016, 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[4] Xiaodong Cui,et al. English Conversational Telephone Speech Recognition by Humans and Machines , 2017, INTERSPEECH.
[5] Marc'Aurelio Ranzato,et al. Classical Structured Prediction Losses for Sequence to Sequence Learning , 2017, NAACL.
[6] Hermann Ney,et al. Bidirectional Decoder Networks for Attention-Based End-to-End Offline Handwriting Recognition , 2016, 2016 15th International Conference on Frontiers in Handwriting Recognition (ICFHR).
[7] Tara N. Sainath,et al. Minimum Word Error Rate Training for Attention-Based Sequence-to-Sequence Models , 2017, 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[8] Pavel Levin,et al. Toward a full-scale neural machine translation in production: the Booking.com use case , 2017, MTSUMMIT.
[9] Ross B. Girshick,et al. Focal Loss for Dense Object Detection , 2017, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[10] Sergey Ioffe,et al. Rethinking the Inception Architecture for Computer Vision , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[11] Erich Elsen,et al. Deep Speech: Scaling up end-to-end speech recognition , 2014, ArXiv.
[12] Lukasz Kaiser,et al. Attention is All you Need , 2017, NIPS.
[13] John Salvatier,et al. Theano: A Python framework for fast computation of mathematical expressions , 2016, ArXiv.
[14] Hermann Ney,et al. The RWTH Aachen Machine Translation System for WMT 2010 , 2010, WMT@ACL.
[15] Pavel Levin,et al. Machine Translation at Booking.com: Journey and Lessons Learned , 2017, ArXiv.
[16] Samy Bengio,et al. Tensor2Tensor for Neural Machine Translation , 2018, AMTA.
[17] Jindrich Libovický,et al. Neural Monkey: An Open-source Tool for Sequence Learning , 2017, Prague Bull. Math. Linguistics.
[18] Yang Liu,et al. Modeling Coverage for Neural Machine Translation , 2016, ACL.
[19] Hermann Ney,et al. A comprehensive study of deep bidirectional LSTM RNNs for acoustic modeling in speech recognition , 2016, 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[20] Jimmy Ba,et al. Adam: A Method for Stochastic Optimization , 2014, ICLR.
[21] Marcin Junczys-Dowmunt,et al. Is Neural Machine Translation Ready for Deployment? A Case Study on 30 Translation Directions , 2016, IWSLT.
[22] Hermann Ney,et al. Handwriting Recognition with Large Multidimensional Long Short-Term Memory Recurrent Neural Networks , 2016, 2016 15th International Conference on Frontiers in Handwriting Recognition (ICFHR).
[23] Zheng Zhang,et al. MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems , 2015, ArXiv.
[24] Hermann Ney,et al. The RWTH Aachen Machine Translation System for IWSLT 2010 , 2010, IWSLT.
[25] Rico Sennrich,et al. Nematus: a Toolkit for Neural Machine Translation , 2017, EACL.
[26] Matt Post,et al. Sockeye: A Toolkit for Neural Machine Translation , 2018, ArXiv.
[27] Yoshua Bengio,et al. Neural Machine Translation by Jointly Learning to Align and Translate , 2014, ICLR.
[28] Hermann Ney,et al. The RWTH Aachen University English-German and German-English Machine Translation System for WMT 2017 , 2017, WMT.
[29] Hermann Ney,et al. Faster sequence training , 2017, 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[30] Rico Sennrich,et al. Neural Machine Translation of Rare Words with Subword Units , 2015, ACL.