STACL: Simultaneous Translation with Integrated Anticipation and Controllable Latency

Simultaneous translation, which translates sentences before they are finished, is useful in many scenarios but is notoriously difficult due to word-order differences and simultaneity requirements. We introduce a very simple yet surprisingly effective “wait-k” model trained to generate the target sentence concurrently with the source sentence, but always k words behind, for any given k. This framework seamlessly integrates anticipation and translation in a single model with only minor changes to the seq2seq framework. We also formulate a new latency metric that addresses deficiencies in previous ones. Experiments show our strategy achieves low latency and reasonable BLEU scores (compared to full-sentence translation) on both Chinese-to-English and English-to-German simultaneous translation.
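The wait-k policy above reduces to a simple read/write schedule: consume the first k source words, then alternate one WRITE per READ, and flush the remaining target words once the source ends. The sketch below illustrates that scheduling only; `translate_step` is a hypothetical stand-in for one step of an incremental seq2seq decoder (here a toy copy model), not the paper's actual model.

```python
def wait_k_schedule(source_tokens, k, translate_step):
    """Yield (action, token) pairs under the wait-k policy:
    READ until k source tokens ahead of the target, else WRITE."""
    target = []
    read = 0
    n = len(source_tokens)
    while True:
        if read < min(n, len(target) + k):
            # READ: consume the next source token.
            yield ("READ", source_tokens[read])
            read += 1
        else:
            # WRITE: emit the next target token given the source prefix so far.
            tok = translate_step(source_tokens[:read], target)
            if tok is None:  # decoder signals end of target sentence
                break
            target.append(tok)
            yield ("WRITE", tok)

# Toy translate_step: copy source tokens one by one (identity "translation"),
# just to make the schedule visible.
def copy_step(src_prefix, tgt_prefix):
    if len(tgt_prefix) < len(src_prefix):
        return src_prefix[len(tgt_prefix)]
    return None

actions = list(wait_k_schedule(["a", "b", "c", "d"], k=2, translate_step=copy_step))
# → [('READ','a'), ('READ','b'), ('WRITE','a'), ('READ','c'),
#    ('WRITE','b'), ('READ','d'), ('WRITE','c'), ('WRITE','d')]
```

Note the steady-state behavior: after the initial k READs, every WRITE lags the source by exactly k tokens, which is what makes the latency controllable by choosing k.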
