International Journal of Computational Intelligence Systems (Jun 2024)

LB-BMBC: MHBiaffine-CNN to Capture Span Scores with BERT Injected with Lexical Information for Chinese NER

  • Tao Guo,
  • Zhichao Zhang

DOI
https://doi.org/10.1007/s44196-024-00521-9
Journal volume & issue
Vol. 17, no. 1
pp. 1 – 15

Abstract

A substantial body of research has shown that introducing lexical information into Chinese Named Entity Recognition (NER) can enrich the semantic and boundary information of Chinese words. In most existing methods, however, lexical information is introduced at the model-architecture level, which cannot fully exploit the lexicon-learning capability of pre-trained models. We therefore propose seamlessly integrating external lexicon knowledge into the Transformer layers of BERT. In addition, we observe that in span-based recognition, adjacent spans exhibit particular spatial relationships. To capture these relationships, we extend the Biaffine scorer with Convolutional Neural Networks (CNNs), treating the score matrix as an image so that the model can exploit the spatial relationships between spans. We evaluate the proposed LB-BMBC model on four publicly available Chinese NER datasets: Resume, Weibo, OntoNotes v4, and MSRA. In particular, ablation experiments show that the CNN component significantly improves performance.
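As a rough illustration of the span-scoring idea described in the abstract, the sketch below pairs a biaffine scorer with a small CNN that treats the resulting L x L score matrix as an image, so that scores of adjacent spans can interact. It is a minimal sketch under assumed hyperparameters (hidden size, biaffine dimension, number of labels, kernel size) and module names, not the authors' released LB-BMBC implementation.

```python
# Minimal sketch: biaffine span scoring followed by a CNN over the score matrix.
# All sizes and module names are illustrative assumptions.
import torch
import torch.nn as nn

class BiaffineCNNScorer(nn.Module):
    def __init__(self, hidden=768, biaffine_dim=256, num_labels=5, kernel=3):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(hidden, biaffine_dim), nn.GELU())
        self.tail = nn.Sequential(nn.Linear(hidden, biaffine_dim), nn.GELU())
        # Biaffine tensor: one (d+1) x (d+1) matrix per label (the +1 adds a bias term).
        self.U = nn.Parameter(torch.randn(num_labels, biaffine_dim + 1, biaffine_dim + 1) * 0.02)
        # CNN over the L x L score matrix: adjacent spans become neighbouring
        # "pixels", so a small kernel lets their scores interact.
        self.cnn = nn.Conv2d(num_labels, num_labels, kernel, padding=kernel // 2)

    def forward(self, x):                      # x: (batch, L, hidden) from BERT
        h = self.head(x)                       # start-of-span representation
        t = self.tail(x)                       # end-of-span representation
        ones = torch.ones(*h.shape[:2], 1, device=x.device)
        h = torch.cat([h, ones], dim=-1)       # (batch, L, d+1)
        t = torch.cat([t, ones], dim=-1)
        # scores[b, c, i, j]: score of span (i, j) for label c
        scores = torch.einsum("bid,cde,bje->bcij", h, self.U, t)
        scores = scores + self.cnn(scores)     # residual CNN refinement of the score "image"
        return scores.permute(0, 2, 3, 1)      # (batch, L, L, num_labels)

scorer = BiaffineCNNScorer()
out = scorer(torch.randn(2, 30, 768))          # e.g. BERT output for a 30-token sentence
print(out.shape)                               # torch.Size([2, 30, 30, 5])
```

In this sketch the CNN is applied residually, so the biaffine scores are refined rather than replaced; the actual interaction scheme used in the paper may differ.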

Keywords