Graph Domain Adversarial Transfer Network for Cross-Domain Sentiment Classification

In text sentiment classification, some words appear unrelated to the task yet directly affect the performance of the classification model. For example, in the sentences “I have terminal cancer” and “Cancer is a very common disease”, the word “cancer” clearly carries two different sentiment tendencies: in the daily-life domain it expresses an extremely negative sentiment, while in the medical domain it is simply a technical term with a relatively neutral sentiment. Although current deep learning models achieve strong performance through their powerful feature learning capabilities, they still handle this cross-domain problem poorly. Therefore, from a new perspective, this paper proposes the Graph Domain Adversarial Transfer Network (GDATN), built on the idea of adversarial learning, which uses labeled source-domain data to predict the sentiment labels of unlabeled target-domain data. GDATN first extracts feature representations with a Bidirectional Long Short-Term Memory (BiLSTM) network followed by a Graph Attention Network (GAT). It then introduces a domain classifier, trained through a Gradient Reversal Layer (GRL), to capture the domain-shared text feature representation. In addition, an auxiliary task, a projection mechanism, is constructed to further capture the domain-specific text feature representation. Extensive experiments on two benchmark datasets show that GDATN outperforms six baseline sentiment classification models and exhibits better stability across different cross-domain pairs.
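The key adversarial component described above, the Gradient Reversal Layer, acts as the identity during the forward pass but reverses (and optionally scales) gradients during backpropagation, so the feature extractor learns representations that confuse the domain classifier. A minimal NumPy sketch of this behavior (the class name, `lambda_` scaling factor, and manual forward/backward methods are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; negates and scales gradients in the backward pass."""

    def __init__(self, lambda_=1.0):
        self.lambda_ = lambda_  # adaptation strength

    def forward(self, x):
        # Features flow to the domain classifier unchanged.
        return x

    def backward(self, grad_output):
        # The domain classifier's gradient is reversed, so the upstream
        # feature extractor is pushed to *increase* domain confusion.
        return -self.lambda_ * grad_output

grl = GradientReversal(lambda_=0.5)
features = np.array([1.0, -2.0, 3.0])
assert np.allclose(grl.forward(features), features)  # identity forward

grad_from_domain_clf = np.array([0.2, 0.4, -0.1])
print(grl.backward(grad_from_domain_clf))  # reversed, scaled by -0.5
```

In an autograd framework such as PyTorch, the same effect is typically obtained with a custom `Function` whose `backward` returns the negated gradient, inserted between the shared encoder and the domain classifier.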
