Recurrent Neural Network for Storytelling

Storytelling is the act of conveying what you want to tell other people as an interesting, true-to-life story. As research in text mining has progressed toward representing words, sentences, and paragraphs as vectors, it has become possible to classify and generate text using these vectors. However, such methods still struggle to generate text with a coherent flow of context and correct grammar. In this paper, we first propose a neural network model that learns a one-to-one mapping between sentences and vectors, so that vectors can be decoded back into sentences. We then propose a second neural network model that learns the content of stories in order to generate sentences. Finally, we use ROUGE to evaluate whether the proposed model generates sentences that form a story better than other methods.
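The abstract's evaluation relies on ROUGE, which scores generated text by its n-gram overlap with a reference. As a minimal sketch (not the authors' evaluation code), the following computes ROUGE-1 recall, the fraction of reference unigrams that also appear in the candidate sentence:

```python
from collections import Counter

def rouge1_recall(reference: str, candidate: str) -> float:
    """ROUGE-1 recall: fraction of reference unigrams covered by the candidate.

    Counts are clipped, so a unigram repeated in the candidate cannot be
    credited more times than it occurs in the reference.
    """
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    overlap = sum(min(count, cand_counts[word]) for word, count in ref_counts.items())
    return overlap / sum(ref_counts.values())

# Five of the six reference unigrams ("the" x2, "cat", "on", "mat") overlap.
print(rouge1_recall("the cat sat on the mat", "the cat lay on the mat"))
```

In practice, full ROUGE implementations also report ROUGE-2 (bigrams) and ROUGE-L (longest common subsequence), which are more sensitive to word order and thus better reflect whether generated sentences follow a story's flow.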
