CAAI Transactions on Intelligence Technology (Dec 2021)

D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction

  • Yuan Huang,
  • Zhixing Li,
  • Wei Deng,
  • Guoyin Wang,
  • Zhimin Lin

DOI: https://doi.org/10.1049/cit2.12033
Journal volume & issue: Vol. 6, No. 4, pp. 417–425

Abstract


Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, high-level syntactic features that capture the dependency between each word and the target entities are incorporated into the pre-trained language model. Our model also utilizes the intermediate layers of BERT to acquire different levels of semantic information and designs multi-granularity features for the final relation classification. Our model offers a significant improvement over published methods for relation extraction on widely used data sets.
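As a rough illustration of the two ideas the abstract describes (not the authors' published code), the sketch below uses the HuggingFace `transformers` BERT implementation to expose intermediate hidden layers and to pool them with a dependency-weighted attention over tokens. The `DependencyAttentionPooler` class, the choice of the last four layers, and the `dep_scores` input (e.g. scores derived from each token's dependency-tree proximity to the target entities) are all illustrative assumptions; the paper's actual D-BERT architecture may differ.

```python
# A minimal sketch, assuming the HuggingFace `transformers` library.
# It shows (1) multi-granularity features from intermediate BERT layers and
# (2) a hypothetical dependency-based attention over token states.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class DependencyAttentionPooler(nn.Module):
    """Pools several BERT layers, weighting tokens by dependency scores."""

    def __init__(self, hidden_size: int = 768, num_layers_used: int = 4):
        super().__init__()
        self.num_layers_used = num_layers_used
        # Project the concatenated multi-layer features to one relation vector.
        self.proj = nn.Linear(hidden_size * num_layers_used, hidden_size)

    def forward(self, hidden_states, dep_scores):
        # hidden_states: tuple of (batch, seq_len, hidden) tensors, one per layer.
        # dep_scores: (batch, seq_len) scores; here assumed to reflect each
        # token's dependency relation to the target entities (an assumption).
        attn = torch.softmax(dep_scores, dim=-1).unsqueeze(-1)  # (B, T, 1)
        pooled = [
            (layer * attn).sum(dim=1)  # dependency-weighted sum over tokens
            for layer in hidden_states[-self.num_layers_used:]  # last k layers
        ]
        return torch.tanh(self.proj(torch.cat(pooled, dim=-1)))


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)

inputs = tokenizer("Steve Jobs founded Apple in California.", return_tensors="pt")
with torch.no_grad():
    outputs = bert(**inputs)

# Placeholder dependency scores; a real system would derive them from a parse.
dep_scores = torch.zeros(inputs["input_ids"].shape)
pooler = DependencyAttentionPooler()
features = pooler(outputs.hidden_states, dep_scores)  # (1, 768) relation feature
print(features.shape)
```

Using intermediate layers rather than only the final one lets the classifier combine lower-level (more lexical/syntactic) and higher-level (more semantic) representations, which is the multi-granularity intuition the abstract points to.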
