A hierarchical and parallel framework for End-to-End Aspect-based Sentiment Analysis

Abstract Pipeline, joint, and collapsed models are three major approaches to the End-to-End Aspect-based Sentiment Analysis (E2E-ABSA) task. Prior works found that joint models were consistently surpassed by the other two. To explore the potential of joint models for E2E-ABSA, we propose a hierarchical and parallel joint framework that exploits the hierarchical nature of the pre-trained language model and performs parallel inference of the subtasks. Our framework: (1) shares the same pre-trained backbone network between the two subtasks, preserving the associations and commonalities between them; (2) considers the hierarchical features of the deep neural network and introduces two joint approaches, namely the specific-layer joint model and the multiple-layer joint model, which couple two specific layers or multiple task-related layers with the subtasks; (3) carries out parallel execution in both training and inference, improving inference throughput and alleviating the target-polarity mismatch problem. Experimental results on three benchmark datasets demonstrate that our approach outperforms state-of-the-art methods.
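
To make the layer-coupling idea concrete, below is a minimal sketch (not the authors' released code) of a specific-layer joint model as described in the abstract: a shared BERT backbone exposes its intermediate hidden states, and two parallel token-level heads, one for aspect-term extraction and one for sentiment polarity, are attached to two specific layers and computed from a single backbone pass. The layer indices, tag-set sizes, and all class and parameter names are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class SpecificLayerJointModel(nn.Module):
    """Sketch of a specific-layer joint model: one shared backbone,
    two parallel task heads coupled to two chosen encoder layers.
    Layer indices and label counts are hypothetical."""

    def __init__(self, model_name="bert-base-uncased",
                 extraction_layer=9, sentiment_layer=12,
                 num_bio_tags=3, num_polarities=3):
        super().__init__()
        # Shared pre-trained backbone; expose all intermediate layers.
        self.bert = BertModel.from_pretrained(model_name,
                                              output_hidden_states=True)
        hidden = self.bert.config.hidden_size
        self.extraction_layer = extraction_layer
        self.sentiment_layer = sentiment_layer
        # Two task-specific heads, each coupled to a specific layer.
        self.extraction_head = nn.Linear(hidden, num_bio_tags)   # e.g. BIO tags
        self.sentiment_head = nn.Linear(hidden, num_polarities)  # e.g. POS/NEG/NEU

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # hidden_states[0] is the embedding output; [i] is the i-th encoder layer.
        states = outputs.hidden_states
        extraction_logits = self.extraction_head(states[self.extraction_layer])
        sentiment_logits = self.sentiment_head(states[self.sentiment_layer])
        # Both heads are produced from one backbone pass, so the two subtasks
        # can be trained and decoded in parallel.
        return extraction_logits, sentiment_logits
```

A multiple-layer variant could instead pool or weight several task-related layers per head before the linear classifiers; the single-pass, two-head structure stays the same.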
