PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking

We propose the task of outline-conditioned story generation: given an outline as a set of phrases describing key characters and events to appear in a story, the task is to generate a coherent narrative consistent with the provided outline. This task is challenging because the input provides only a rough sketch of the plot, so models must generate a story by weaving through the key points in the outline. This requires the model to keep track of the dynamic states of the latent plot, conditioning on the input outline while generating the full story. We present PlotMachines, a neural narrative model that learns to transform an outline into a coherent story by tracking the dynamic plot states. In addition, we enrich PlotMachines with high-level discourse structure so that the model can learn different styles of writing corresponding to different parts of the narrative. Comprehensive experiments over three fiction and non-fiction datasets demonstrate that recently introduced large-scale language models, such as GPT-2 and Grover, despite their impressive generation performance, fall short of generating coherent narratives consistent with the given outline, and that dynamic plot state tracking is important for composing narratives with tighter, more consistent plots.
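To make the idea of dynamic plot state tracking concrete, the sketch below shows one plausible form of a gated memory update: each outline point keeps a memory vector that is partially overwritten after every generated paragraph. This is an illustrative simplification, not the paper's exact equations; the function name `update_plot_memory`, the gate parameterization `W_g`, and the toy dimensions are all assumptions for the example.

```python
import numpy as np

def update_plot_memory(memory, paragraph_repr, W_g):
    """Gated update of per-outline-point plot memory (illustrative sketch).

    memory:         (k, d) array, one state vector per outline point
    paragraph_repr: (d,) representation of the paragraph just generated
    W_g:            (d, d) gate projection (a hypothetical parameter)
    """
    # Sigmoid gate in (0, 1): how much of the old state to keep per dimension.
    gate = 1.0 / (1.0 + np.exp(-(memory + paragraph_repr) @ W_g))
    # Convex combination: retain old plot state where the gate is high,
    # absorb the new paragraph's information where it is low.
    return gate * memory + (1.0 - gate) * paragraph_repr

rng = np.random.default_rng(0)
k, d = 3, 8                                # 3 outline points, 8-dim states
memory = rng.normal(size=(k, d))           # plot state before this paragraph
para = rng.normal(size=d)                  # encoding of the new paragraph
W_g = rng.normal(size=(d, d))
new_memory = update_plot_memory(memory, para, W_g)
```

Because the gate lies strictly between 0 and 1, each updated entry stays between the old memory value and the new paragraph value, so the plot state drifts smoothly across paragraphs rather than being rewritten wholesale.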
