Enhancing Topic-to-Essay Generation with External Commonsense Knowledge

Automatic topic-to-essay generation is a challenging task, since it requires generating novel, diverse, and topic-consistent paragraph-level text from a set of topics given as input. Previous work tends to perform essay generation based solely on the given topics, ignoring the wealth of available commonsense knowledge. Such commonsense knowledge provides additional background information that can help generate essays that are more novel and diverse. To fill this gap, we propose to integrate commonsense from an external knowledge base into the generator through a dynamic memory mechanism. In addition, adversarial training based on a multi-label discriminator is employed to further improve topic consistency. We also develop a series of automatic evaluation metrics to comprehensively assess the quality of the generated essays. Experiments show that with external commonsense knowledge and adversarial training, the generated essays are more novel, diverse, and topic-consistent than those of existing methods, under both automatic and human evaluation.
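The abstract describes reading commonsense knowledge into the generator through a dynamic memory mechanism: the decoder attends over a memory of concept embeddings (e.g., ConceptNet neighbours of the input topics) and the memory slots are updated as decoding proceeds. The sketch below is a minimal, hypothetical illustration of that read/update cycle in plain Python; the gating form and all names are assumptions, not the paper's exact formulation.

```python
import math
import random

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def memory_read(memory, query):
    """Attend over commonsense memory slots using the decoder state as query.

    memory: list of concept embeddings (hypothetical ConceptNet neighbours
    of the input topics); query: current decoder hidden state.
    Returns the weighted commonsense context vector and attention weights.
    """
    weights = softmax([dot(slot, query) for slot in memory])
    d = len(query)
    context = [sum(w * slot[i] for w, slot in zip(weights, memory))
               for i in range(d)]
    return context, weights

def memory_update(memory, query):
    """Gated slot rewrite: each slot drifts toward the decoder state,
    so later decoding steps read a dynamically updated memory."""
    updated = []
    for slot in memory:
        g = 1.0 / (1.0 + math.exp(-dot(slot, query)))  # per-slot gate in (0, 1)
        updated.append([(1 - g) * s + g * q for s, q in zip(slot, query)])
    return updated

# Toy decoding step with random embeddings.
random.seed(0)
d, k = 6, 4
memory = [[random.gauss(0, 1) for _ in range(d)] for _ in range(k)]
query = [random.gauss(0, 1) for _ in range(d)]
context, weights = memory_read(memory, query)
memory = memory_update(memory, query)
```

In a full model, `context` would be concatenated with the decoder input at each step so generation is conditioned on the retrieved commonsense, and the update step lets already-used concepts fade from the memory.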
