Yejin Choi | Ronan Le Bras | Chandra Bhagavatula | Keisuke Sakaguchi | Niket Tandon | Peter Clark
[1] Omer Levy, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension, 2020, ACL.
[2] Nathanael Chambers, et al. Unsupervised Learning of Narrative Event Chains, 2008, ACL.
[3] Manfred Pinkal, et al. Learning Script Knowledge with Web Experiments, 2010, ACL.
[4] Stephen Clark, et al. What Happens Next? Event Prediction Using a Compositional Neural Network Model, 2016, AAAI.
[5] Gerald DeJong, et al. Learning Schemata for Natural Language Processing, 1985, IJCAI.
[6] Nathanael Chambers, et al. A Corpus and Cloze Evaluation for Deeper Understanding of Commonsense Stories, 2016, NAACL.
[7] Yejin Choi, et al. Globally Coherent Text Generation with Neural Checklist Models, 2016, EMNLP.
[8] Nathanael Chambers. Behind the Scenes of an Evolving Event Cloze Test, 2017, LSDSem@EACL.
[9] Marie-Francine Moens, et al. Skip N-grams and Ranking Functions for Predicting Script Events, 2012, EACL.
[10] Yejin Choi, et al. COMET: Commonsense Transformers for Automatic Knowledge Graph Construction, 2019, ACL.
[11] Nanyun Peng, et al. Towards Controllable Story Generation, 2018.
[12] Jean-Yves Ramel, et al. An Exact Graph Edit Distance Algorithm for Solving Pattern Recognition Problems, 2015, ICPRAM.
[13] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, arXiv.
[14] Benjamin Van Durme, et al. Script Induction as Association Rule Mining, 2020, NUSE.
[15] Francis Ferraro, et al. Script Induction as Language Modeling, 2015, EMNLP.
[16] Sanja Fidler, et al. VirtualHome: Simulating Household Activities Via Programs, 2018, CVPR.
[17] Ivan Titov, et al. Modeling Semantic Expectation: Using Script Knowledge for Referent Prediction, 2017, TACL.
[18] Yejin Choi, et al. COMET-ATOMIC 2020: On Symbolic and Neural Commonsense Knowledge Graphs, 2021, AAAI.
[19] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[20] Aman Madaan, et al. Neural Language Modeling for Contextualized Temporal Graph Generation, 2021, NAACL.
[21] Simon Ostermann, et al. InScript: Narrative texts annotated with script information, 2016, LREC.
[22] Jianfeng Gao, et al. PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking, 2020, EMNLP.
[23] Ralph Weischedel, et al. Machine-Assisted Script Curation, 2021, NAACL.
[24] Niranjan Balasubramanian, et al. Hierarchical Quantized Representations for Script Generation, 2018, EMNLP.
[25] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[26] Stefan Thater, et al. A Crowdsourced Database of Event Sequence Descriptions for the Acquisition of High-quality Script Knowledge, 2016, LREC.
[27] Ivan Titov, et al. Inducing Neural Models of Script Knowledge, 2014, CoNLL.
[28] Raymond J. Mooney, et al. Statistical Script Learning with Multi-Argument Events, 2014, EACL.
[29] Thomas Wolf, et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing, 2019, arXiv.
[30] Colin Raffel, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2020, JMLR.
[31] Simon Ostermann. Script Knowledge for Natural Language Understanding, 2020.
[32] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[33] Wei Shi, et al. A Hybrid Model for Globally Coherent Story Generation, 2019, Second Workshop on Storytelling.