Sentiment Analysis Model Based on Structure Attention Mechanism

Because the long short-term memory (LSTM) network is a sequential structure, it has difficulty representing the structural-level information of a context. Sentiment analysis based on the original LSTM therefore suffers from a loss of structural-level information, and its capacity to capture context information is limited. To address this problem, we propose a novel structure-attention-based LSTM as a hierarchical structure model that captures as much of the relevant contextual information as possible. We introduce HM (the h_t matrix) to store the structural information of the context, and we apply an attention mechanism over HM to perform vector selection. Compared with the original LSTM and standard attention-based sentiment classification methods, our model achieves higher classification precision, which indicates that the structure-attention-based method proposed in this study has an advantage in capturing the latent semantic structure of text.
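The following is a minimal sketch of how such a model could be assembled in PyTorch, given only the description above; the class name StructureAttentionLSTM and all layer and parameter names are illustrative assumptions, not the authors' implementation. The LSTM hidden states h_t are collected into the matrix HM, and attention weights computed over HM select which hidden states contribute to the sentence representation used for classification.

    # Illustrative sketch of the structure-attention idea; names are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class StructureAttentionLSTM(nn.Module):
        def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Attention parameters used to score each stored hidden state.
            self.attn_w = nn.Linear(hidden_dim, hidden_dim)
            self.attn_v = nn.Linear(hidden_dim, 1, bias=False)
            self.classifier = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len)
            x = self.embed(token_ids)
            # hm stores every hidden state h_t, i.e. the "HM" matrix of the text:
            # shape (batch, seq_len, hidden_dim).
            hm, _ = self.lstm(x)
            # Attention scores over the stored hidden states (vector selection).
            scores = self.attn_v(torch.tanh(self.attn_w(hm)))   # (batch, seq_len, 1)
            weights = F.softmax(scores, dim=1)                   # (batch, seq_len, 1)
            # Weighted sum of hidden states yields the sentence representation.
            context = (weights * hm).sum(dim=1)                  # (batch, hidden_dim)
            return self.classifier(context)

    # Usage example with random token ids.
    model = StructureAttentionLSTM(vocab_size=10000, embed_dim=128,
                                   hidden_dim=256, num_classes=2)
    logits = model(torch.randint(0, 10000, (4, 20)))             # shape (4, 2)

In this sketch the attention module replaces the usual practice of classifying from the final hidden state alone, which is how the loss of structural-level information described above can be mitigated.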