Reinforcement Learning for Abstractive Question Summarization with Question-aware Semantic Rewards
Shweta Yadav | Deepak Gupta | Asma Ben Abacha | Dina Demner-Fushman
[1] Vaibhava Goel, et al. Self-Critical Sequence Training for Image Captioning, 2017, CVPR.
[2] Mirella Lapata, et al. Text Summarization with Pretrained Encoders, 2019, EMNLP.
[3] Deepak Gupta, et al. NLM at MEDIQA 2021: Transfer Learning-based Approaches for Consumer Question and Multi-Answer Summarization, 2021, BioNLP.
[4] Lukasz Kaiser, et al. Attention Is All You Need, 2017, NIPS.
[5] Dina Demner-Fushman, et al. Overview of the MEDIQA 2021 Shared Task on Summarization in the Medical Domain, 2021, BioNLP.
[6] Asma Ben Abacha, et al. On the Role of Question Summarization and Information Source Restriction in Consumer Health Question Answering, 2019, AMIA Joint Summits on Translational Science Proceedings.
[7] Ming Zhou, et al. ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training, 2020, Findings of EMNLP.
[8] Frank Hutter, et al. SGDR: Stochastic Gradient Descent with Warm Restarts, 2016, ICLR.
[9] Yifan He, et al. damo_nlp at MEDIQA 2021: Knowledge-based Preprocessing and Coverage-oriented Reranking for Medical Question Summarization, 2021, BioNLP.
[10] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.
[11] Richard Socher, et al. A Deep Reinforced Model for Abstractive Summarization, 2017, ICLR.
[12] Asma Ben Abacha, et al. Question-aware Transformer Models for Consumer Health Question Summarization, 2021, J. Biomed. Informatics.
[13] Jiaxin Pei, et al. MATINF: A Jointly Labeled Large-Scale Dataset for Classification, Question Answering and Summarization, 2020, ACL.
[14] Mohit Bansal, et al. Addressing Semantic Drift in Question Generation for Semi-Supervised Question Answering, 2019, EMNLP.
[15] Chin-Yew Lin, et al. ROUGE: A Package for Automatic Evaluation of Summaries, 2004, ACL.
[16] John Canny, et al. The Summary Loop: Learning to Write Abstractive Summaries Without Examples, 2020, ACL.
[17] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.
[18] Ronald J. Williams. Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning, 1992, Machine Learning.
[19] Asma Ben Abacha, et al. On the Summarization of Consumer Health Questions, 2019, ACL.
[20] Halil Kilicoglu, et al. Interpreting Consumer Health Questions: The Role of Anaphora and Ellipsis, 2013, BioNLP@ACL.
[21] Colin Raffel, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2019, J. Mach. Learn. Res.
[22] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[23] Yao Zhao, et al. PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization, 2020, ICML.
[24] Armen Aghajanyan, et al. Better Fine-Tuning by Reducing Representational Collapse, 2020, ICLR.
[25] Franck Dernoncourt, et al. A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents, 2018, NAACL.
[26] Yen-Chun Chen, et al. Fast Abstractive Summarization with Reinforce-Selected Sentence Rewriting, 2018, ACL.
[27] Nazli Goharian, et al. Ontology-Aware Clinical Abstractive Summarization, 2019, SIGIR.
[28] Marc'Aurelio Ranzato, et al. Sequence Level Training with Recurrent Neural Networks, 2015, ICLR.
[29] Christopher D. Manning, et al. Optimizing the Factual Correctness of a Summary: A Study of Summarizing Radiology Reports, 2020, ACL.
[30] Christopher D. Manning, et al. Get To The Point: Summarization with Pointer-Generator Networks, 2017, ACL.
[31] Ramakanth Pasunuru, et al. Multi-Reward Reinforced Summarization with Saliency and Entailment, 2018, NAACL.
[32] Nazli Goharian, et al. Attend to Medical Ontologies: Content Selection for Clinical Abstractive Summarization, 2020, ACL.
[33] Furu Wei, et al. MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers, 2020, NeurIPS.
[34] Pushpak Bhattacharyya, et al. Reinforced Multi-task Approach for Multi-hop Question Generation, 2020, COLING.
[35] Omer Levy, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension, 2019, ACL.
[36] Ulf Leser, et al. WBI at MEDIQA 2021: Summarizing Consumer Health Questions with Generative Transformers, 2021, BioNLP.