Future Internet (Nov 2018)

A Bi-Directional LSTM-CNN Model with Attention for Aspect-Level Text Classification

  • Yonghua Zhu,
  • Xun Gao,
  • Weilin Zhang,
  • Shenkai Liu,
  • Yuanyuan Zhang

DOI: https://doi.org/10.3390/fi10120116
Journal volume & issue: Vol. 10, No. 12, p. 116

Abstract

The prevalence of people sharing opinions about products and services on the Internet has generated a large quantity of comment data, which contain great business value. Comment sentences often mention several aspects, and the sentiment expressed toward each aspect can differ, so assigning a single overall sentiment polarity to the sentence is not meaningful. In this paper, we introduce the Attention-based Aspect-level Recurrent Convolutional Neural Network (AARCNN) to analyze remarks at the aspect level. The model integrates an attention mechanism with target-information analysis, which enables it to concentrate on the important parts of the sentence and to make full use of the target information. The model uses a bidirectional LSTM (Bi-LSTM) to build the memory of the sentence, and a CNN is then applied to extract attention from the memory, yielding an attentive sentence representation. The model uses aspect embedding to analyze the target information of the representation and finally outputs the sentiment polarity through a softmax layer. The model was tested on multi-language datasets and demonstrated better performance than conventional deep learning methods.
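To make the pipeline in the abstract concrete, the following is a minimal PyTorch sketch of the described flow (Bi-LSTM memory, CNN-derived attention, aspect-embedding fusion, softmax output). It is not the authors' implementation: the class name, layer sizes, the single-filter convolution used to score attention, and the concatenation-based fusion are all assumptions introduced for illustration.

```python
# Hypothetical sketch of the AARCNN-style pipeline described in the abstract:
# Bi-LSTM memory -> CNN-based attention -> aspect-embedding fusion -> softmax.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AARCNNSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128,
                 aspect_vocab_size=10, num_classes=3, kernel_size=3):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.aspect_embed = nn.Embedding(aspect_vocab_size, embed_dim)
        # Bi-LSTM builds the sentence "memory".
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # 1-D convolution over the memory produces unnormalized attention scores
        # (one score per token position); an assumed, simplified choice.
        self.attn_conv = nn.Conv1d(2 * hidden_dim, 1, kernel_size,
                                   padding=kernel_size // 2)
        self.classifier = nn.Linear(2 * hidden_dim + embed_dim, num_classes)

    def forward(self, tokens, aspect_ids):
        # tokens: (batch, seq_len); aspect_ids: (batch,)
        memory, _ = self.bilstm(self.word_embed(tokens))          # (B, T, 2H)
        scores = self.attn_conv(memory.transpose(1, 2))           # (B, 1, T)
        weights = F.softmax(scores.squeeze(1), dim=-1)            # (B, T)
        # Weighted sum of memory states gives the attentive sentence representation.
        sent_repr = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)  # (B, 2H)
        # Fuse the sentence representation with the aspect (target) embedding.
        fused = torch.cat([sent_repr, self.aspect_embed(aspect_ids)], dim=-1)
        return F.log_softmax(self.classifier(fused), dim=-1)      # class log-probs

# Toy usage: batch of 2 sentences, 8 tokens each, 3 sentiment classes.
model = AARCNNSketch(vocab_size=5000)
tokens = torch.randint(0, 5000, (2, 8))
aspects = torch.tensor([1, 4])
print(model(tokens, aspects).shape)  # torch.Size([2, 3])
```

The sketch treats the aspect as a categorical ID with its own embedding table and fuses it by concatenation before classification; the paper's actual attention scoring and aspect-integration details may differ.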

Keywords