Attention-based sentiment analysis using convolutional and recurrent neural network

Abstract Convolutional and recurrent neural networks have achieved remarkable performance in natural language processing (NLP). From the attention-mechanism perspective, however, the convolutional neural network (CNN) is applied less often than the recurrent neural network (RNN), because an RNN can learn long-term dependencies and therefore tends to give better results. A CNN nevertheless has its own advantage: it can extract high-level features from a fixed-size local context at the input level. This paper therefore proposes a new model based on an RNN with a CNN-based attention mechanism, combining the merits of both architectures in a single model. In the proposed model, the CNN first learns high-level features of the sentence from the input representation. Second, an attention mechanism focuses the model on the features that contribute most to the prediction task by computing an attention score over the feature contexts generated by the CNN filters. Finally, these feature contexts, weighted by their attention scores, are processed sequentially by the RNN. We validate the model through experiments on three benchmark datasets; the experimental results and their analysis demonstrate the effectiveness of the model.
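The three-step pipeline described in the abstract can be sketched in miniature. The code below is an illustrative toy, not the authors' implementation: a single hand-set convolution filter, the filter width, and the scalar RNN weights (`w_in`, `w_rec`) are all assumptions introduced for demonstration, and real models would use learned parameters, multiple filters, and vector hidden states.

```python
import math

def conv1d(embs, filt, width=2):
    """Step 1 (CNN): slide a fixed-size window over token embeddings
    and produce one high-level feature per position."""
    feats = []
    for i in range(len(embs) - width + 1):
        window = [x for e in embs[i:i + width] for x in e]  # flatten the window
        feats.append(sum(w * x for w, x in zip(filt, window)))
    return feats

def softmax(xs):
    """Step 2 (attention): turn raw feature scores into a distribution."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def rnn_with_attention(feats, scores, w_in=0.5, w_rec=0.3):
    """Step 3 (RNN): process attention-weighted features sequentially,
    keeping a scalar hidden state for simplicity."""
    h = 0.0
    for f, a in zip(feats, scores):
        h = math.tanh(w_in * a * f + w_rec * h)
    return h

# Toy sentence of three 2-d token embeddings, one filter of width 2.
embs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
feats = conv1d(embs, filt=[1.0, 1.0, 1.0, 1.0])
scores = softmax(feats)            # attention over CNN feature contexts
sentiment = rnn_with_attention(feats, scores)
```

Under these assumptions the attention scores form a proper distribution over the CNN feature contexts, and the RNN's tanh hidden state stays bounded in (-1, 1), mirroring how the weighted contexts would feed a final sentiment classifier.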
