Sensors (Apr 2023)

Information Extraction Network Based on Multi-Granularity Attention and Multi-Scale Self-Learning

  • Weiwei Sun,
  • Shengquan Liu,
  • Yan Liu,
  • Lingqi Kong,
  • Zhaorui Jian

DOI
https://doi.org/10.3390/s23094250
Journal volume & issue
Vol. 23, no. 9
p. 4250

Abstract


Transforming the task of information extraction into a machine reading comprehension (MRC) framework has shown promising results. The MRC model takes the context and query as inputs to the encoder, and the decoder extracts one or more text spans as answers (entities and relationships) from the text. Existing approaches typically use multi-layer encoders, such as Transformers, to generate hidden features of the source sequence. However, increasing the number of encoder layers can make the granularity of the representation coarser and the hidden features of different words more similar, potentially leading to misjudgments by the model. To address this issue, a new method called the multi-granularity attention multi-scale self-learning network (MAML-NET) is proposed, which enhances the model’s understanding ability by utilizing representations of the source sequence at different granularities. Additionally, through the proposed multi-scale self-learning attention mechanism, MAML-NET can independently learn task-related information from both global and local dimensions based on the learned multi-granularity features. Experimental results on two information extraction tasks, named entity recognition and entity relationship extraction, demonstrated that the method outperformed existing MRC-based methods and achieved the best performance on five benchmark datasets.
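The MRC framing the abstract describes can be sketched with a toy span-extraction decoder: the encoder scores each context token as a possible answer start or end, and the decoder returns the highest-scoring valid span. The scores below are hand-picked stand-ins for encoder outputs, and the function is an illustrative assumption, not a reproduction of MAML-NET itself.

```python
def extract_span(tokens, start_scores, end_scores, max_len=8):
    """Return the token span maximizing start_score + end_score,
    subject to start <= end and span length <= max_len."""
    best, best_score = None, float("-inf")
    for i, s in enumerate(start_scores):
        # Only consider ends at or after the start, within max_len tokens.
        for j in range(i, min(i + max_len, len(tokens))):
            score = s + end_scores[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return tokens[best[0]:best[1] + 1]

# Query: "Which person is mentioned?" -- toy context and scores.
tokens = ["Barack", "Obama", "visited", "Berlin"]
start_scores = [2.0, 0.1, -1.0, 0.5]
end_scores = [0.3, 1.8, -0.5, 0.9]
print(extract_span(tokens, start_scores, end_scores))  # ['Barack', 'Obama']
```

In a real MRC extractor the start/end scores come from a learned projection over the encoder's hidden states; the decoding step shown here is the part that turns those scores into entity or relationship spans.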

Keywords