IEEE Access (Jan 2019)
R-Transformer Network Based on Position and Self-Attention Mechanism for Aspect-Level Sentiment Classification
Abstract
Aspect-level sentiment classification (ASC) is a research hotspot in natural language processing that aims to infer the sentiment polarity of a particular aspect in an opinion sentence. Three main factors influence aspect-level sentiment classification: the semantic information of the context; the interaction between the context and the aspect; and the positional relationship between the aspect and the context. Researchers have proposed various methods for aspect-level sentiment classification. However, previous work mainly used the average vector of the aspect to calculate the attention scores over the context, which introduced noise from irrelevant aspect words. Moreover, these attention-based approaches simply used relative positions to compute positional information for context and aspect terms, and did not provide richer semantic information. To address these problems, we propose the PSRTN model in this paper. First, we obtain the position-aware influence propagated between context words and the aspect through a Gaussian kernel, generating an influence vector for each context word. Second, we capture the global and local information of the context with the R-Transformer, and use a self-attention mechanism to identify the keywords within the aspect. Finally, a context representation specific to the given aspect is generated for classification. To evaluate the validity of the model, we conduct experiments on the SemEval 2014 and Twitter datasets. The results show that the PSRTN model achieves accuracies of 83.8%, 80.9%, and 75.1% on the three datasets, respectively.
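The Gaussian-kernel position weighting mentioned above can be illustrated with a minimal sketch. The paper's exact formulation is given in the body of the article; here the kernel width `sigma` and the distance definition (token offset to the nearest aspect word) are illustrative assumptions:

```python
import math

def position_weights(seq_len, aspect_start, aspect_end, sigma=3.0):
    """Gaussian-kernel position influence (illustrative sketch).

    Context words closer to the aspect span [aspect_start, aspect_end)
    receive weights nearer 1.0; distant words decay smoothly toward 0.
    `sigma` is a hypothetical kernel-width hyperparameter.
    """
    weights = []
    for i in range(seq_len):
        if aspect_start <= i < aspect_end:
            d = 0  # aspect words themselves get full influence
        else:
            # token distance to the nearest end of the aspect span
            d = min(abs(i - aspect_start), abs(i - (aspect_end - 1)))
        weights.append(math.exp(-(d ** 2) / (2 * sigma ** 2)))
    return weights
```

In a model such as PSRTN, weights of this kind would scale each context word's representation before attention, so that words far from the aspect contribute less to the aspect-specific context representation.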
Keywords