IEEE Access (Jan 2024)

Study on Chinese Semantic Entity Recognition Method for Cabin Utilizing BERT-BiGRU Model

  • Ruina Ma,
  • Hui Cao,
  • Zhihao Song,
  • Xiaoyu Wu

DOI
https://doi.org/10.1109/ACCESS.2024.3386760
Journal volume & issue
Vol. 12
pp. 56042 – 56049

Abstract


Named Entity Recognition (NER) aims to identify entities in unstructured engine room domain text. In this domain, however, entities are diverse and complex, and nesting is common, which leads to low entity recognition rates. This paper proposes a deep learning method that incorporates a language model to improve entity recognition performance in the engine room domain. First, the Bidirectional Encoder Representations from Transformers (BERT) language model is employed for text feature extraction, producing a matrix of word-level vector representations. Second, the trained word vectors are fed into a Bidirectional Gated Recurrent Unit (BiGRU) to extract contextual semantic entity features. Finally, a Conditional Random Field (CRF) model extracts the globally optimal label sequence, yielding the named entities in ship cabin semantics. Experimental results show that the proposed algorithm achieves better F1 values for all three entity types; compared with BERT-BiGRU, the overall accuracy, recall, and F1 value of entity recognition improve by 1.35%, 1.45%, and 1.40%, respectively.
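The final step of the pipeline, extracting the globally optimal label sequence with a CRF, is typically done with Viterbi decoding. The following is a minimal, self-contained sketch of that decoding step; the BIO tag set, and the emission and transition scores used in the usage example, are hypothetical illustrations, not values from the paper's trained BERT-BiGRU-CRF model:

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence (globally optimal path).

    emissions:   list of per-token score dicts, e.g. [{"B": 2.0, "I": 0.0, "O": 1.0}, ...]
                 (in the full model these would come from the BiGRU output layer)
    transitions: dict mapping (prev_tag, tag) -> score; unlisted pairs score 0.0
    """
    tags = list(emissions[0].keys())
    # score[t] = best total score of any path ending in tag t at the current step
    score = dict(emissions[0])
    backpointers = []  # one {tag: best_prev_tag} dict per step after the first
    for emit in emissions[1:]:
        new_score, ptr = {}, {}
        for t in tags:
            # pick the predecessor tag that maximizes path score into t
            best_prev = max(tags, key=lambda p: score[p] + transitions.get((p, t), 0.0))
            new_score[t] = score[best_prev] + transitions.get((best_prev, t), 0.0) + emit[t]
            ptr[t] = best_prev
        score = new_score
        backpointers.append(ptr)
    # backtrack from the best final tag to recover the full sequence
    last = max(tags, key=lambda t: score[t])
    path = [last]
    for ptr in reversed(backpointers):
        path.append(ptr[path[-1]])
    return list(reversed(path))


# Illustrative usage: a 3-token sentence with BIO tags for a single entity type.
emissions = [
    {"B": 2.0, "I": 0.0, "O": 1.0},
    {"B": 0.0, "I": 2.0, "O": 1.0},
    {"B": 0.0, "I": 0.0, "O": 2.0},
]
# Hypothetical transition scores rewarding valid B->I and I->I continuations.
transitions = {("B", "I"): 1.0, ("I", "I"): 1.0, ("O", "B"): 0.5}
print(viterbi_decode(emissions, transitions))  # -> ['B', 'I', 'O']
```

Because the CRF scores transitions between adjacent tags, invalid sequences such as `O` followed directly by `I` can be penalized, which is how the model enforces well-formed entity spans.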

Keywords