IEEE Access (Jan 2023)

Toward Transformer Fusions for Chinese Sentiment Intensity Prediction in Valence-Arousal Dimensions

  • Yu-Chih Deng
  • Yih-Ru Wang
  • Sin-Horng Chen
  • Lung-Hao Lee

DOI: https://doi.org/10.1109/ACCESS.2023.3322436
Journal volume & issue: Vol. 11, pp. 109974–109982

Abstract


BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based neural network built from encoder layers with a self-attention mechanism. In this study, we develop a Chinese word-level BERT to learn contextual language representations and propose a transformer fusion framework for Chinese sentiment intensity prediction in the valence-arousal dimensions. Experimental results on the Chinese EmoBank indicate that our transformer-based fusion model outperforms other neural-network-based, regression-based, and lexicon-based methods, reflecting the effectiveness of integrating semantic representations at different degrees of linguistic granularity. Our proposed transformer fusion framework is also simple and easy to fine-tune on different downstream tasks.
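To make the setup concrete, the sketch below shows one plausible reading of the approach: two BERT encoders at different linguistic granularities whose pooled sentence embeddings are fused by concatenation and fed to a two-output regression head predicting valence and arousal. This is an illustrative assumption, not the authors' exact architecture; "bert-base-chinese" stands in for both the character-level and the paper's word-level Chinese BERT, whose checkpoint is not publicly named here.

    # Minimal sketch (assumed architecture, not the paper's exact model):
    # fuse sentence embeddings from two granularities of BERT, then
    # regress valence-arousal intensity scores.
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class FusionVARegressor(nn.Module):
        def __init__(self, char_name="bert-base-chinese",
                     word_name="bert-base-chinese"):  # word-level checkpoint assumed
            super().__init__()
            self.char_bert = AutoModel.from_pretrained(char_name)
            self.word_bert = AutoModel.from_pretrained(word_name)
            hidden = (self.char_bert.config.hidden_size
                      + self.word_bert.config.hidden_size)
            # Two regression outputs: valence and arousal.
            self.head = nn.Linear(hidden, 2)

        def forward(self, char_inputs, word_inputs):
            # Pooled [CLS] representation from each encoder.
            c = self.char_bert(**char_inputs).pooler_output
            w = self.word_bert(**word_inputs).pooler_output
            # Fuse by concatenation, then regress valence-arousal.
            return self.head(torch.cat([c, w], dim=-1))

    tok = AutoTokenizer.from_pretrained("bert-base-chinese")
    model = FusionVARegressor()
    enc = tok(["這部電影非常精彩"], return_tensors="pt", padding=True)
    with torch.no_grad():
        va = model(enc, enc)  # same tokenization reused for illustration
    print(va.shape)  # torch.Size([1, 2]) -> [valence, arousal]

Under this reading, fine-tuning would minimize a mean-squared-error loss between the two predicted scores and the gold valence-arousal ratings from the Chinese EmoBank; a word-level variant would simply tokenize on segmented words before encoding.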

Keywords