NEWS Article Summarization with Pretrained Transformer
