Fine-Grained Text Sentiment Transfer via Dependency Parsing

Fine-grained sentiment transfer requires editing an input sentence to a given sentiment intensity while preserving its content, which substantially extends traditional binary sentiment transfer. Previous work on sentiment transfer usually attempts to learn a latent content representation disentangled from sentiment. However, completely separating these two factors is difficult, and it is also unnecessary. In this paper, we propose a novel model that learns the latent representation without disentanglement and feeds the sentiment intensity to the decoder for fine-grained sentiment control. Moreover, aligned sentences with the same content but different sentiment intensities are usually unavailable. To compensate for the lack of parallel data, we construct pseudo-parallel sentences (i.e., sentences with similar content but different intensities) to ease the learning burden of our model. Specifically, motivated by the fact that a sentiment word (e.g., “delicious”) has a close relationship with its non-sentiment context word (e.g., “food”), we use dependency parsing to capture this relationship, and produce pseudo-parallel sentences by replacing the sentiment word with a new one chosen according to the specific context word. In addition, the difference between pseudo-parallel and generated sentences, together with other constraints, is used to guide the model in precisely revising sentiment. Experiments on the Yelp dataset show that our method substantially improves content preservation and sentiment accuracy, achieving state-of-the-art performance.
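
The pseudo-parallel construction step can be illustrated with a short sketch. The snippet below is a minimal, illustrative implementation only, not the authors' code: it assumes spaCy as the dependency parser and a hypothetical intensity_lexicon mapping a context noun to candidate sentiment words at different target intensities.

```python
# Minimal sketch of pseudo-parallel sentence construction via dependency parsing.
# Assumptions (not from the paper): spaCy as the parser and a toy intensity lexicon.
import spacy

nlp = spacy.load("en_core_web_sm")

# Hypothetical lexicon: context noun -> {target intensity -> replacement sentiment word}.
intensity_lexicon = {
    "food": {0.2: "bland", 0.5: "decent", 0.9: "delicious"},
    "service": {0.2: "slow", 0.5: "adequate", 0.9: "impeccable"},
}

def make_pseudo_parallel(sentence: str, target_intensity: float) -> str:
    """Replace each sentiment word (adjectival modifier) with one matching the
    target intensity, chosen according to the context noun it depends on."""
    doc = nlp(sentence)
    tokens = [t.text for t in doc]
    for token in doc:
        # An "amod" arc links a sentiment adjective to the noun it modifies,
        # e.g. delicious --amod--> food.
        if token.dep_ == "amod" and token.head.lemma_ in intensity_lexicon:
            candidates = intensity_lexicon[token.head.lemma_]
            # Pick the candidate whose intensity is closest to the target.
            best = min(candidates, key=lambda k: abs(k - target_intensity))
            tokens[token.i] = candidates[best]
    return " ".join(tokens)

print(make_pseudo_parallel("the delicious food came with slow service", 0.2))
# -> "the bland food came with slow service"
```

In practice the replacement candidates would come from a sentiment lexicon with intensity scores rather than the hand-written dictionary above; the dependency lookup from sentiment word to its governing context noun is the part that mirrors the construction described in the abstract.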
