[1] Bowen Zhou, et al. Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond, 2016, CoNLL.
[2] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.
[3] Taku Kudo, et al. SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing, 2018, EMNLP.
[4] Yang Liu, et al. Modeling Coverage for Neural Machine Translation, 2016, ACL.
[5] Alexander M. Rush, et al. Bottom-Up Abstractive Summarization, 2018, EMNLP.
[6] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[7] Geoffrey E. Hinton, et al. Layer Normalization, 2016, arXiv.
[8] Chin-Yew Lin, et al. ROUGE: A Package for Automatic Evaluation of Summaries, 2004, ACL.
[9] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.
[10] Hang Li, et al. Incorporating Copying Mechanism in Sequence-to-Sequence Learning, 2016, ACL.
[11] Navdeep Jaitly, et al. Pointer Networks, 2015, NIPS.
[12] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[13] Christopher D. Manning, et al. Get To The Point: Summarization with Pointer-Generator Networks, 2017, ACL.
[14] Yejin Choi, et al. Deep Communicating Agents for Abstractive Summarization, 2018, NAACL.
[15] Mirella Lapata, et al. Don’t Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization, 2018, EMNLP.
[16] Elena Lloret, et al. The challenging task of summary evaluation: an overview, 2017, Language Resources and Evaluation.
[17] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[18] Yejin Choi, et al. The Curious Case of Neural Text Degeneration, 2019, ICLR.
[19] Richard Socher, et al. A Deep Reinforced Model for Abstractive Summarization, 2017, ICLR.
[20] Lukasz Kaiser, et al. Generating Wikipedia by Summarizing Long Sequences, 2018, ICLR.
[21] Alexander M. Rush, et al. Abstractive Sentence Summarization with Attentive Recurrent Neural Networks, 2016, NAACL.
[22] Quoc V. Le, et al. Semi-supervised Sequence Learning, 2015, NIPS.
[23] Piji Li, et al. Actor-Critic based Training Framework for Abstractive Summarization, 2018, arXiv.
[24] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[25] Sanja Fidler, et al. Efficient Summarization with Read-Again and Copy Mechanism, 2016, arXiv.
[26] Sebastian Ruder, et al. Universal Language Model Fine-tuning for Text Classification, 2018, ACL.
[27] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[28] Jason Weston, et al. A Neural Attention Model for Abstractive Sentence Summarization, 2015, EMNLP.
[29] George Kurian, et al. Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016, arXiv.
[30] Natalie Schluter, et al. The limits of automatic summarisation according to ROUGE, 2017, EACL.