IEEE Access (Jan 2020)

A Self-Attentive Convolutional Neural Networks for Emotion Classification on User-Generated Contents

  • Ying Qian,
  • Weiwei Liu,
  • Jiangping Huang

DOI
https://doi.org/10.1109/ACCESS.2019.2938560
Journal volume & issue
Vol. 8
pp. 154198–154208

Abstract

Detecting emotions in user-generated content (UGC) is challenging because UGC tends to have sparse emotional semantics, multiple emotions within the same text, and rapidly changing emotional expressions. Word embeddings can extract high-level features from words to enrich their semantics. Convolutional neural networks (CNNs) make model training efficient enough to adapt to fast-changing emotional expressions, but their standard pooling operation cannot simultaneously retain the multiple emotional features that benefit classification, which makes it unsuitable for UGC. In this paper, we propose self-attentive convolutional neural networks (SACNNs), trained on top of pre-trained word vectors, for emotion detection on UGC. The model preserves different kinds of emotional information, avoids the loss of emotional aspects in the pooling stage of the CNN structure, and improves interpretability by visualizing the feature-extraction process. It also accelerates training convergence and can be updated in real time to detect emotions with rich and novel expressions. The proposed model combines a CNN with a self-attention mechanism, where self-attention selects the key emotional features after convolution. We evaluate the proposed model on two datasets, from NLPCC 2014 and SemEval 2018 Task 1. The experimental results show that our model achieves significantly better performance than the baselines on multi-label emotion classification of UGC. In addition, we analyze the experimental results and the rationale of the self-attention mechanism in detail, and visualize the most influential convolutional filter windows based on the attention weights.
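Read as an architecture, the abstract describes convolutional feature extraction over pre-trained word embeddings, with a self-attention layer in place of pooling and a multi-label (sigmoid) output. The sketch below is only an illustration of that idea under stated assumptions: the embedding dimension, filter count, window sizes, additive attention scoring, and label count are illustrative choices, not the configuration reported in the paper.

```python
# Minimal sketch of a self-attentive CNN for multi-label emotion classification.
# Hyperparameters and the additive attention scoring are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentiveCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, n_filters=100,
                 window_sizes=(2, 3, 4), n_labels=11, attn_dim=64):
        super().__init__()
        # Embedding layer; the paper initializes it from pre-trained word vectors.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One 1-D convolution per filter window size.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, n_filters, k, padding=k // 2) for k in window_sizes
        )
        # Additive self-attention over the convolved feature sequence
        # (used here instead of max-pooling so several emotional cues can be kept).
        feat_dim = n_filters * len(window_sizes)
        self.attn_proj = nn.Linear(feat_dim, attn_dim)
        self.attn_score = nn.Linear(attn_dim, 1, bias=False)
        self.classifier = nn.Linear(feat_dim, n_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        x = self.embedding(token_ids).transpose(1, 2)        # (batch, embed_dim, seq_len)
        feats = [F.relu(conv(x)) for conv in self.convs]     # per-window feature maps
        min_len = min(f.size(2) for f in feats)              # align lengths across windows
        feats = torch.cat([f[:, :, :min_len] for f in feats], dim=1).transpose(1, 2)
        # feats: (batch, min_len, feat_dim)
        scores = self.attn_score(torch.tanh(self.attn_proj(feats)))  # (batch, min_len, 1)
        weights = F.softmax(scores, dim=1)                   # attention over positions
        context = (weights * feats).sum(dim=1)               # attention-weighted summary
        return torch.sigmoid(self.classifier(context))       # independent per-label probabilities
```

The attention weights produced in the forward pass are also what a visualization of influential filter windows could be based on, e.g. by inspecting `weights` for a batch such as `SelfAttentiveCNN(vocab_size=30000)(torch.randint(0, 30000, (4, 50)))`.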

Keywords