Controlling Linguistic Style Aspects in Neural Language Generation

Most work on neural natural language generation (NNLG) focuses on controlling the content of the generated text. We experiment with controlling several stylistic aspects of the generated text in addition to its content. The method is based on a conditioned RNN language model, where the desired content as well as the stylistic parameters serve as conditioning contexts. We demonstrate the approach on the movie-review domain and show that it succeeds in generating coherent sentences that match the required linguistic style and content.
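The conditioning idea in the abstract can be sketched in a few lines: a vector encoding the desired content and style parameters is concatenated to the word embedding at every timestep, so every recurrent update sees the conditioning context. The sketch below is a minimal, illustrative NumPy version with made-up dimensions and parameter names; it is not the paper's actual model (which uses an LSTM trained on movie reviews).

```python
import numpy as np

# Minimal sketch of one step of a conditioned RNN language model.
# A fixed style/content vector is concatenated to the word embedding
# at every timestep, conditioning the hidden-state update on it.
# All dimensions and weight names here are illustrative assumptions.
rng = np.random.default_rng(0)
vocab_size, embed_dim, style_dim, hidden_dim = 100, 16, 4, 32

W_embed = rng.standard_normal((vocab_size, embed_dim)) * 0.1
W_xh = rng.standard_normal((embed_dim + style_dim, hidden_dim)) * 0.1
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
W_hy = rng.standard_normal((hidden_dim, vocab_size)) * 0.1

def step(token_id, style_vec, h):
    """One recurrent step, conditioned on a fixed style/content vector."""
    x = np.concatenate([W_embed[token_id], style_vec])  # inject conditioning
    h = np.tanh(x @ W_xh + h @ W_hh)                    # recurrent update
    logits = h @ W_hy
    probs = np.exp(logits - logits.max())               # softmax over vocab
    return probs / probs.sum(), h

# Hypothetical binary style parameters, e.g. sentiment and tense.
style = np.array([1.0, 0.0, 1.0, 0.0])
h = np.zeros(hidden_dim)
probs, h = step(token_id=3, style_vec=style, h=h)
assert probs.shape == (vocab_size,)
assert abs(probs.sum() - 1.0) < 1e-6
```

At generation time the same style vector is held fixed across all timesteps, steering the distribution over next words toward the requested style while the token inputs carry the evolving content.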
