International Journal of Computational Intelligence Systems (Dec 2020)

Dual Neural Network Fusion Model for Chinese Named Entity Recognition

  • Dandan Zhao,
  • Jingxiang Cao,
  • Degen Huang,
  • Jiana Meng,
  • Pan Zhang

DOI
https://doi.org/10.2991/ijcis.d.201216.001
Journal volume & issue
Vol. 14, no. 1

Abstract

Chinese named entity recognition (NER) plays an important role in natural language processing (NLP) applications. The task is complicated by strong dependencies between entities and their contexts, the absence of explicit word delimiters in Chinese text, and the insufficient feature representation of any single model. This paper therefore proposes a dual neural network fusion model (DFM) to improve Chinese NER performance. We integrate the traditional bi-directional long short-term memory (BiLSTM) structure and a self-attention mechanism (ATT) with a dilated convolutional neural network (DCNN) to better capture context information. Additionally, we exploit Google's pretrained model, bi-directional encoder representations from transformers (BERT), as the embedding layer. The proposed model has the following merits: (1) a dual neural network architecture is proposed to enhance the robustness of the extracted features; (2) an attention mechanism is fused into the dual neural network to extract implicit context representation information for Chinese NER; (3) dilated convolutions are used to trade off between performance and execution speed. Experiments show that our proposed model outperforms state-of-the-art Chinese NER methods.
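To make the described architecture concrete, below is a minimal PyTorch sketch of the dual-network fusion idea: BERT token embeddings feed both a BiLSTM-plus-self-attention branch and a dilated-convolution branch, and the two feature streams are fused before per-token tag classification. The layer sizes, the concatenation-based fusion, and the softmax tagger are illustrative assumptions and not the authors' exact configuration.

# Minimal sketch of the dual neural network fusion model (DFM); hyperparameters,
# fusion by concatenation, and the linear tagger are assumptions for illustration.
import torch
import torch.nn as nn

class DualFusionNER(nn.Module):
    def __init__(self, bert_dim=768, hidden=256, num_tags=17):
        super().__init__()
        # Branch 1: BiLSTM followed by multi-head self-attention (ATT).
        self.bilstm = nn.LSTM(bert_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=8, batch_first=True)
        # Branch 2: stacked dilated convolutions (DCNN) over the token axis.
        self.dcnn = nn.Sequential(
            nn.Conv1d(bert_dim, 2 * hidden, kernel_size=3, padding=1, dilation=1), nn.ReLU(),
            nn.Conv1d(2 * hidden, 2 * hidden, kernel_size=3, padding=2, dilation=2), nn.ReLU(),
            nn.Conv1d(2 * hidden, 2 * hidden, kernel_size=3, padding=4, dilation=4), nn.ReLU(),
        )
        # Fusion of the two branches + per-token tag classifier
        # (a CRF layer could replace the softmax tagger).
        self.classifier = nn.Linear(4 * hidden, num_tags)

    def forward(self, bert_embeddings):                  # (batch, seq_len, bert_dim)
        lstm_out, _ = self.bilstm(bert_embeddings)       # (batch, seq_len, 2*hidden)
        att_out, _ = self.attn(lstm_out, lstm_out, lstm_out)
        conv_out = self.dcnn(bert_embeddings.transpose(1, 2)).transpose(1, 2)
        fused = torch.cat([att_out, conv_out], dim=-1)   # simple concatenation fusion
        return self.classifier(fused)                    # per-token tag logits

In use, the input would be the last hidden states of a pretrained Chinese BERT (e.g., BertModel from the transformers library), trained with a token-level cross-entropy or CRF loss over the NER tag set.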

Keywords