Sensors (Apr 2022)

S-NER: A Concise and Efficient Span-Based Model for Named Entity Recognition

  • Jie Yu,
  • Bin Ji,
  • Shasha Li,
  • Jun Ma,
  • Huijun Liu,
  • Hao Xu

DOI
https://doi.org/10.3390/s22082852
Journal volume & issue
Vol. 22, no. 8
p. 2852

Abstract

Named entity recognition (NER) is a task that seeks to recognize entities in raw texts and is a precondition for a series of downstream NLP tasks. Traditionally, NER models use the sequence labeling mechanism, which requires label dependencies captured by conditional random fields (CRFs). However, these models are prone to cascading label misclassifications, since a misclassified label corrupts the label dependency and may cause subsequent labels to be misclassified as well. To address this issue, we propose S-NER, a span-based NER model. Specifically, S-NER first splits raw texts into text spans and regards them as candidate entities; it then obtains span types directly by performing entity type classification on span semantic representations, which eliminates the need for label dependency. Moreover, S-NER has a concise neural architecture in which it directly uses BERT as its encoder and a feed-forward network as its decoder. We evaluate S-NER on several benchmark datasets across three domains. Experimental results demonstrate that S-NER consistently outperforms the strongest baselines in terms of F1-score. Extensive analyses further confirm the efficacy of S-NER.
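The abstract outlines the general span-based recipe: enumerate candidate text spans, encode them with BERT, and classify each span's entity type with a feed-forward network. The following is a minimal sketch of that recipe, not the authors' implementation; the maximum span length, the span pooling (concatenating start and end token embeddings), and the label set are illustrative assumptions.

```python
# Minimal span-based NER sketch: BERT encoder + feed-forward decoder.
# Assumptions (not from the paper): max span length of 8, span representation
# built from start/end token embeddings, and a toy label set.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

LABELS = ["O", "PER", "ORG", "LOC"]  # hypothetical entity types


class SpanNER(nn.Module):
    def __init__(self, encoder_name="bert-base-cased", max_span_len=8):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.max_span_len = max_span_len
        # Feed-forward decoder: span representation -> entity-type logits
        self.ffn = nn.Sequential(
            nn.Linear(hidden * 2, hidden),
            nn.ReLU(),
            nn.Linear(hidden, len(LABELS)),
        )

    def forward(self, input_ids, attention_mask):
        # Token-level contextual embeddings from BERT (batch size 1 for brevity)
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        seq_len = h.size(1)
        spans, reps = [], []
        # Enumerate all candidate spans up to max_span_len and build a simple
        # span representation from the start and end token embeddings.
        for i in range(seq_len):
            for j in range(i, min(i + self.max_span_len, seq_len)):
                spans.append((i, j))
                reps.append(torch.cat([h[0, i], h[0, j]], dim=-1))
        logits = self.ffn(torch.stack(reps))  # one type prediction per span
        return spans, logits


# Usage (untrained weights, so predictions are random; shown only to
# illustrate the span enumeration and classification flow):
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = SpanNER()
enc = tokenizer("Barack Obama visited Paris.", return_tensors="pt")
with torch.no_grad():
    spans, logits = model(enc["input_ids"], enc["attention_mask"])
pred = logits.argmax(dim=-1)
for (i, j), p in zip(spans, pred):
    if LABELS[p] != "O":
        print(tokenizer.decode(enc["input_ids"][0, i:j + 1]), "->", LABELS[p])
```

Because each span is classified independently from its own representation, no label transition structure (and hence no CRF) is needed, which is the property the abstract highlights.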

Keywords