IEEE Access (Jan 2019)

Bi-Level Attention Model for Sentiment Analysis of Short Texts

  • Wei Liu,
  • Guoxi Cao,
  • Jianqin Yin

DOI
https://doi.org/10.1109/ACCESS.2019.2936457
Journal volume & issue
Vol. 7
pp. 119813 – 119822

Abstract


Short text is an important form of information dissemination and opinion expression on various social media platforms. Sentiment analysis of short texts helps in understanding customers' emotional states and in obtaining their opinions and attitudes toward events, information, and products; it is difficult, however, because of the sparsity of short-text data. Unlike traditional methods that rely on external knowledge, this paper proposes a bi-level attention model (BAM) for sentiment analysis of short texts that does not depend on external knowledge to handle data sparsity. Specifically, at the word level, the model improves word representation by introducing latent topic information into the word-level semantic representation: a neural topic model discovers the latent topics of the text, and a new topic-word attention mechanism explores word semantics from the perspective of topic-word association. At the sequence level, a secondary attention mechanism captures the relationship between local and global sentiment expression. Experiments on the ChnSentiCorp-Htl-ba-10000 and NLPCC-ECGC datasets validate the effectiveness of the BAM model.
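
The following is a minimal PyTorch sketch of the bi-level attention idea described in the abstract, not the authors' exact architecture: all layer sizes, the use of a GRU encoder, the way the topic vector is obtained, and the exact form of the attention scores are assumptions made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TopicWordAttention(nn.Module):
    """Word-level attention that scores each word against a latent topic vector."""
    def __init__(self, emb_dim, topic_dim):
        super().__init__()
        self.proj = nn.Linear(topic_dim, emb_dim, bias=False)

    def forward(self, word_emb, topic_vec):
        # word_emb: (batch, seq_len, emb_dim); topic_vec: (batch, topic_dim)
        query = self.proj(topic_vec).unsqueeze(2)         # (batch, emb_dim, 1)
        scores = torch.bmm(word_emb, query).squeeze(2)    # (batch, seq_len)
        weights = F.softmax(scores, dim=1)
        # Topic-aware representation as a weighted sum of word embeddings.
        return torch.bmm(weights.unsqueeze(1), word_emb).squeeze(1), weights


class BiLevelAttentionClassifier(nn.Module):
    """Word-level topic attention followed by a second, sequence-level attention."""
    def __init__(self, vocab_size, emb_dim=128, topic_dim=64, hidden=128, classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_attn = TopicWordAttention(emb_dim, topic_dim)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.seq_query = nn.Linear(2 * hidden, 1)         # sequence-level attention scorer
        self.out = nn.Linear(2 * hidden + emb_dim, classes)

    def forward(self, tokens, topic_vec):
        emb = self.emb(tokens)                            # (batch, seq_len, emb_dim)
        topic_repr, _ = self.word_attn(emb, topic_vec)    # (batch, emb_dim)
        states, _ = self.encoder(emb)                     # (batch, seq_len, 2*hidden)
        seq_scores = self.seq_query(states).squeeze(2)    # (batch, seq_len)
        seq_weights = F.softmax(seq_scores, dim=1)
        seq_repr = torch.bmm(seq_weights.unsqueeze(1), states).squeeze(1)
        # Fuse the global (sequence-level) and topic-aware (word-level) representations.
        return self.out(torch.cat([seq_repr, topic_repr], dim=1))


# Tiny usage example with random data; in the paper the topic vector would
# come from a neural topic model rather than being sampled at random.
model = BiLevelAttentionClassifier(vocab_size=1000)
tokens = torch.randint(1, 1000, (4, 20))                  # batch of 4 short texts
topics = torch.rand(4, 64)                                # stand-in topic proportions
logits = model(tokens, topics)
print(logits.shape)                                       # torch.Size([4, 2])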

Keywords