IEEE Access (Jan 2021)

Multi-Head Self-Attention Transformation Networks for Aspect-Based Sentiment Analysis

  • Yuming Lin
  • Chaoqiang Wang
  • Hao Song
  • You Li

DOI: https://doi.org/10.1109/ACCESS.2021.3049294
Journal volume & issue: Vol. 9, pp. 8762–8770

Abstract

Aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity of an input sentence with respect to a given aspect. Many existing ABSA methods employ long short-term memory (LSTM) networks with an attention mechanism. However, the attention mechanism models only certain local dependencies of the input and fails to capture the global dependencies among the inputs. Moreover, merely improving the attention mechanism does not address target-sensitive sentiment expression, which has been shown to degrade prediction effectiveness. In this work, we propose multi-head self-attention transformation (MSAT) networks for ABSA, which perform more effective sentiment analysis through target-specific self-attention and dynamic target representation. Given a set of review sentences, MSAT first applies multi-head target-specific self-attention to better capture global dependencies and introduces a target-sensitive transformation to effectively tackle target-sensitive sentiment. Second, part-of-speech (POS) features are integrated into MSAT to capture the grammatical features of sentences. A series of experiments on the SemEval 2014 and Twitter datasets shows that the proposed model outperforms several state-of-the-art methods.
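The central idea summarized above is self-attention whose weights are conditioned on the aspect term. The abstract does not specify the exact architecture, so the following PyTorch sketch is only illustrative: it assumes the pooled target embedding is concatenated to every token representation before the query and key projections, which makes each attention head target-specific. All module names, dimensions, and the concatenation-based fusion are hypothetical choices, not the authors' released implementation.

```python
# Minimal sketch of multi-head target-specific self-attention
# (illustrative assumptions throughout; not the paper's exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetSpecificSelfAttention(nn.Module):
    def __init__(self, d_model=128, n_heads=4):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Queries and keys see [token; target], so attention weights
        # depend on the aspect target (assumed fusion scheme).
        self.w_q = nn.Linear(2 * d_model, d_model)
        self.w_k = nn.Linear(2 * d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, x, target):
        # x: (batch, seq_len, d_model) contextual token embeddings
        # target: (batch, d_model) pooled aspect/target embedding
        b, n, d = x.shape
        t = target.unsqueeze(1).expand(-1, n, -1)        # (b, n, d)
        xt = torch.cat([x, t], dim=-1)                   # (b, n, 2d)
        q = self.w_q(xt).view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        k = self.w_k(xt).view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        v = self.w_v(x).view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        # Every token attends to every other token, giving the global
        # dependence that a single local attention pass lacks.
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = F.softmax(scores, dim=-1)                 # (b, heads, n, n)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.w_o(out)

# Toy usage: 2 sentences of 10 tokens, one pooled target vector each.
x = torch.randn(2, 10, 128)
target = torch.randn(2, 128)
print(TargetSpecificSelfAttention()(x, target).shape)  # torch.Size([2, 10, 128])
```

In this reading, swapping the target vector changes the attention scores themselves rather than only a final gating step, which is one plausible way to realize the target-sensitive behavior the abstract describes.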
