Towards Controllable Story Generation

We present a general framework for analyzing existing story corpora and generating controllable, creative new stories. The proposed framework requires little manual annotation to achieve controllable story generation, and it creates a new interface through which humans can interact with computers to generate personalized stories. We apply the framework to build recurrent neural network (RNN)-based generation models that control story ending valence (Egidi and Gerrig, 2009) and storyline. Experiments show that our methods successfully achieve the intended control and that introducing storylines enhances story coherence: with the additional control factors, the generation model attains lower perplexity and, according to human evaluation, yields more coherent stories that are faithful to the control factors.
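As a rough illustration of the conditioning idea described above (a minimal sketch, not the paper's exact architecture), one common way to inject control factors into an RNN generator is to map the ending-valence label and storyline keywords to control tokens that are prepended to the story tokens, so every generated word is conditioned on them. All class and variable names below are hypothetical.

```python
import torch
import torch.nn as nn


class ControlledStoryLM(nn.Module):
    """LSTM language model conditioned on prepended control tokens (sketch)."""

    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, control_ids, story_ids):
        # Prepend control tokens (e.g. a valence token and storyline
        # keywords) so the story is generated conditioned on them.
        x = torch.cat([control_ids, story_ids], dim=1)
        h, _ = self.lstm(self.embed(x))
        # The hidden state just before each story position predicts that token.
        story_len = story_ids.size(1)
        return self.proj(h[:, -story_len - 1:-1, :])


# Hypothetical usage: 2 stories, a 10k-word vocabulary, and 3 control tokens
# per story (1 ending-valence token + 2 storyline keyword tokens).
model = ControlledStoryLM(vocab_size=10000)
control = torch.randint(0, 10000, (2, 3))   # control token ids
story = torch.randint(0, 10000, (2, 20))    # gold story token ids
logits = model(control, story)              # shape: (2, 20, 10000)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 10000), story.reshape(-1))
```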

[1] Richard J. Gerrig, et al. How valence affects language processing: Negativity bias and mood congruence in narrative comprehension. Memory & Cognition, 2009.

[2] Selmer Bringsjord, et al. Artificial Intelligence and Literary Creativity: Inside the Mind of Brutus, A Storytelling Machine. 1999.

[3] Reid Swanson, et al. Say Anything: Using Textual Case-Based Reasoning to Enable Open-Domain Interactive Storytelling. TIIS, 2012.

[4] Jürgen Schmidhuber, et al. Long Short-Term Memory. Neural Computation, 1997.

[5] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate. ICLR, 2014.

[6] Regina Barzilay, et al. Style Transfer from Non-Parallel Text by Cross-Alignment. NIPS, 2017.

[7] Robert Michael Young, et al. Narrative Planning: Balancing Plot and Character. J. Artif. Intell. Res., 2010.

[8] Frank D. Wood, et al. Learning Disentangled Representations with Semi-Supervised Deep Generative Models. NIPS, 2017.

[9] Rafael Pérez y Pérez, et al. MEXICA: A computer model of a cognitive account of creative writing. J. Exp. Theor. Artif. Intell., 2001.

[10] Yoav Goldberg, et al. Controlling Linguistic Style Aspects in Neural Language Generation. arXiv, 2017.

[11] Marc Cavazza, et al. Emotional input for character-based interactive storytelling. AAMAS, 2009.

[12] Alexander M. Rush, et al. OpenNMT: Open-Source Toolkit for Neural Machine Translation. ACL, 2017.

[13] Mark O. Riedl, et al. Event Representations for Automated Story Generation with Deep Neural Nets. AAAI, 2017.

[14] Nebojsa Jojic, et al. Steering Output Style and Topic in Neural Response Generation. EMNLP, 2017.

[15] Raquel Hervás, et al. Story plot generation based on CBR. Knowl. Based Syst., 2004.

[16] Boyang Li, et al. Story Generation with Crowdsourced Plot Graphs. AAAI, 2013.

[17] Nathanael Chambers, et al. A Corpus and Cloze Evaluation for Deeper Understanding of Commonsense Stories. NAACL, 2016.

[18] Pieter Abbeel, et al. InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets. NIPS, 2016.

[19] James R. Meehan, et al. TALE-SPIN, An Interactive Program that Writes Stories. IJCAI, 1977.

[20] Eric P. Xing, et al. Toward Controlled Generation of Text. ICML, 2017.

[21] Sanja Fidler, et al. Skip-Thought Vectors. NIPS, 2015.

[22] Dongyan Zhao, et al. Style Transfer in Text: Exploration and Evaluation. AAAI, 2017.

[23] Naoya Inoue, et al. An RNN-based Binary Classifier for the Story Cloze Test. LSDSem@EACL, 2017.

[24] Xing Shi, et al. Hafez: an Interactive Poetry Generation System. ACL, 2017.

[25] Nick Cramer, et al. Automatic Keyword Extraction from Individual Documents. 2010.

[26] S. Turner, et al. Minstrel: a computer model of creativity and storytelling. 1993.

[27] Guillaume Lample, et al. Fader Networks: Manipulating Images by Sliding Attributes. NIPS, 2017.

[28] Rui Yan, et al. i, Poet: Automatic Poetry Composition through Recurrent Neural Networks with Iterative Polishing Schema. IJCAI, 2016.

[29] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks. NIPS, 2014.

[30] Ting Liu, et al. Document Modeling with Gated Recurrent Neural Network for Sentiment Classification. EMNLP, 2015.