Story Centaur: Large Language Model Few Shot Learning as a Creative Writing Tool
Sherol Chen | Kory Mathewson | Ben Swanson | Ben Pietrzak | Monica Dinalescu
[1] Mark O. Riedl, et al. Story Realization: Expanding Plot Events into Sentences, 2020, AAAI.
[2] Piotr W. Mirowski, et al. Improvised Theatre Alongside Artificial Intelligences, 2021, AIIDE.
[3] Noah Wardrip-Fruin, et al. Cozy mystery construction kit: prototyping toward an AI-assisted collaborative storytelling mystery game, 2019, FDG.
[4] Ilya Sutskever, et al. Learning to Generate Reviews and Discovering Sentiment, 2017, arXiv.
[5] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.
[6] Jeffrey Dean, et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[7] Xing Shi, et al. Hafez: an Interactive Poetry Generation System, 2017, ACL.
[8] David C. Uthus, et al. TextSETTR: Label-Free Text Style Extraction and Tunable Targeted Restyling, 2021, arXiv.
[9] Chris Callison-Burch, et al. Unsupervised Hierarchical Story Infilling, 2019, Proceedings of the First Workshop on Narrative Understanding.
[10] Luke S. Zettlemoyer, et al. Deep Contextualized Word Representations, 2018, NAACL.
[11] Nicky Case, et al. How To Become A Centaur, 2018.
[12] Alec Radford, et al. Improving Language Understanding by Generative Pre-Training, 2018.
[13] Mark O. Riedl, et al. Improvisational Computational Storytelling in Open Worlds, 2016, ICIDS.
[14] Simon Colton, et al. Explainable Computational Creativity, 2020, ICCC.
[15] Mark Chen, et al. Language Models are Few-Shot Learners, 2020, NeurIPS.
[16] Yann Dauphin, et al. Hierarchical Neural Story Generation, 2018, ACL.
[17] Emily M. Bender, et al. Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data, 2020, ACL.
[18] Piotr W. Mirowski, et al. Human Improvised Theatre Augmented with Artificial Intelligence, 2019, Creativity & Cognition.
[19] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[20] Omer Levy, et al. SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems, 2019, NeurIPS.
[21] Sungwoo Lee, et al. I Lead, You Help but Only with Enough Details: Understanding User Experience of Co-Creation with Artificial Intelligence, 2018, CHI.
[22] Jianfeng Gao, et al. Towards Coherent and Cohesive Long-form Text Generation, 2018, Proceedings of the First Workshop on Narrative Understanding.
[23] Emily M. Bender, et al. On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜, 2021, FAccT.