Complex & Intelligent Systems (Jul 2022)

Sentence part-enhanced BERT with respect to downstream tasks

  • Chaoming Liu,
  • Wenhao Zhu,
  • Xiaoyu Zhang,
  • Qiuhong Zhai

DOI
https://doi.org/10.1007/s40747-022-00819-1
Journal volume & issue
Vol. 9, no. 1
pp. 463 – 474

Abstract

Bidirectional encoder representations from transformers (BERT) have achieved great success in many natural language processing tasks. However, BERT generally takes the embedding of the first token to represent sentence meaning in tasks such as sentiment analysis and textual similarity, which does not properly treat different sentence parts. Different sentence parts have different levels of importance for different downstream tasks. For example, main parts (subject, predicate, and object) play crucial roles in textual similarity calculation, while secondary parts (adverbial and complement) are more important than the main parts in sentiment analysis. To this end, we propose a sentence part-enhanced BERT (SpeBERT) model that uses sentence parts relevant to the downstream task to enhance sentence representations. Specifically, we encode sentence parts based on dependency parsing and the downstream task, and extract their embeddings through a pooling operation. Furthermore, we design several fusion strategies to incorporate the different embeddings. We evaluate the proposed SpeBERT model on two downstream tasks, sentiment classification and semantic textual similarity, over six benchmark datasets. The experimental results show that our model achieves better performance than competing models.
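The abstract outlines a pipeline: identify task-relevant sentence parts (e.g., via dependency parsing), pool the embeddings of their tokens, and fuse the result with BERT's first-token embedding. A minimal pure-Python sketch of the pooling and fusion steps is shown below, assuming token embeddings have already been computed and using a simple element-wise average as one hypothetical fusion strategy (the paper designs several; none of the function names here come from the paper):

```python
from typing import List

Vector = List[float]

def mean_pool(token_embeddings: List[Vector], part_mask: List[bool]) -> Vector:
    """Average the embeddings of tokens belonging to the selected sentence part."""
    selected = [emb for emb, keep in zip(token_embeddings, part_mask) if keep]
    if not selected:
        raise ValueError("part_mask selects no tokens")
    dim = len(selected[0])
    return [sum(vec[d] for vec in selected) / len(selected) for d in range(dim)]

def fuse_average(cls_embedding: Vector, part_embedding: Vector) -> Vector:
    """One simple fusion strategy: element-wise average of the [CLS]
    embedding and the pooled sentence-part embedding."""
    return [(c + p) / 2.0 for c, p in zip(cls_embedding, part_embedding)]

# Toy example: 3 tokens with 2-dim embeddings; tokens 0 and 2 form the
# main part (e.g., subject and object found by a dependency parser).
tokens = [[1.0, 2.0], [10.0, 10.0], [3.0, 4.0]]
mask = [True, False, True]
part = mean_pool(tokens, mask)              # -> [2.0, 3.0]
sentence = fuse_average([0.0, 1.0], part)   # -> [1.0, 2.0]
```

In practice the part mask would be derived from a dependency parse (keeping subject/predicate/object heads for similarity tasks, adverbials and complements for sentiment), and the fusion could instead be concatenation or a learned gate, as the paper explores.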

Keywords