Mathematical Biosciences and Engineering (Sep 2023)

Multi-task learning for aspect level semantic classification combining complex aspect target semantic enhancement and adaptive local focus

  • Quan Zhu,
  • Xiaoyin Wang,
  • Xuan Liu,
  • Wanru Du,
  • Xingxing Ding

DOI: https://doi.org/10.3934/mbe.2023824
Journal volume & issue: Vol. 20, No. 10, pp. 18566–18591

Abstract


Aspect-based sentiment analysis (ABSA) is a fine-grained and diverse task in natural language processing. Existing deep learning models for ABSA must balance the demand for finer-grained sentiment analysis against the scarcity of training corpora at that granularity. To address this issue, we propose an enhanced BERT-based model for multi-dimensional aspect-target semantic learning. First, our model leverages BERT's pre-training and fine-tuning mechanisms to capture rich semantic feature parameters. Second, we propose a complex semantic enhancement mechanism for aspect targets that enriches and optimizes fine-grained training corpora. Third, we combine the aspect recognition enhancement mechanism with a CRF model to achieve more robust and accurate entity recognition for aspect targets. Fourth, we propose an adaptive local attention mechanism that focuses on sentiment elements around rich aspect-target semantics. Finally, because each task contributes unequally during joint training, we carefully optimize the joint-training approach so that the multiple tasks benefit one another. Experimental results on four Chinese and five English datasets demonstrate that the proposed mechanisms and methods effectively improve ABSA models, surpassing some of the latest models in both multi-task and single-task scenarios.
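The abstract does not spell out the exact form of the adaptive local attention or the joint-training objective, but the two ideas can be illustrated with a minimal sketch. The snippet below assumes a simple distance-based local-focus scheme (tokens within a semantic relative distance `srd` of the aspect span keep full weight, farther tokens decay) and a fixed-weight sum of per-task losses; the paper's actual mechanisms are adaptive and may differ substantially. All names (`local_focus_weights`, `joint_loss`, `srd`) are illustrative, not from the paper.

```python
def local_focus_weights(seq_len, aspect_start, aspect_end, srd=3):
    """Down-weight tokens far from the aspect-target span.

    Tokens whose distance to the span [aspect_start, aspect_end]
    is at most `srd` keep weight 1.0; farther tokens decay
    linearly toward 0 with distance.
    """
    weights = [1.0] * seq_len
    for i in range(seq_len):
        if i < aspect_start:
            d = aspect_start - i
        elif i > aspect_end:
            d = i - aspect_end
        else:
            d = 0  # inside the aspect span
        if d > srd:
            weights[i] = max(0.0, 1.0 - (d - srd) / seq_len)
    return weights


def joint_loss(task_losses, task_weights):
    """Weighted sum of per-task losses for joint multi-task training."""
    return sum(w * l for w, l in zip(task_weights, task_losses))


# Example: 10-token sentence, aspect target at positions 4-5.
w = local_focus_weights(10, 4, 5, srd=2)
print(w)  # tokens near the aspect keep weight 1.0; distant ones decay
print(joint_loss([1.0, 2.0], [0.5, 0.25]))
```

In practice such weights would multiply the attention scores (or hidden states) of a BERT encoder, and the task weights would themselves be learned rather than fixed, which is presumably where the "adaptive" and "mutually beneficial" aspects of the paper come in.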

Keywords