Reduce Redundant Repetition Using Decoding History for Sequence-to-Sequence Summarization
[1] Christopher D. Manning, et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.
[2] Marc'Aurelio Ranzato, et al. Sequence Level Training with Recurrent Neural Networks, 2016, ICLR.
[3] Richard Socher, et al. A Deep Reinforced Model for Abstractive Summarization, 2018, ICLR.
[4] Xu Sun, et al. Decoding-History-Based Adaptive Control of Attention for Neural Machine Translation, 2018, ArXiv.
[5] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[6] Lukasz Kaiser, et al. Attention Is All You Need, 2017, NIPS.
[7] Ronald J. Williams, et al. A Learning Algorithm for Continually Running Fully Recurrent Neural Networks, 1989, Neural Computation.
[8] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2015, ICLR.