Generating Semantically Similar and Human-Readable Summaries With Generative Adversarial Networks