Information (Jul 2024)

Multi-Level Attention with 2D Table-Filling for Joint Entity-Relation Extraction

  • Zhenyu Zhang,
  • Lin Shi,
  • Yang Yuan,
  • Huanyue Zhou,
  • Shoukun Xu

DOI
https://doi.org/10.3390/info15070407
Journal volume & issue
Vol. 15, no. 7
p. 407

Abstract

Joint entity-relation extraction is a fundamental task in the construction of large-scale knowledge graphs. This task relies not only on the semantics of text spans but also on their intricate connections, including classification and structural details that most previous models overlook. In this paper, we propose incorporating this information into the learning process. Specifically, we design a novel two-dimensional word-pair tagging method to define the task of entity and relation extraction, which allows type markers to focus on text tokens and gather information for their corresponding spans. Additionally, we introduce a multi-level attention neural network to enhance the model's capacity to perceive structure-aware features. Our experiments show that our approach overcomes the limitations of earlier tagging methods and yields more accurate results. We evaluate our model on three datasets: SciERC, ADE, and CoNLL04. Our model demonstrates competitive performance compared to the state of the art, surpassing other approaches on the majority of evaluated metrics.
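To make the 2D table-filling idea concrete, here is a minimal sketch of how joint entity-relation extraction can be cast as filling word-pair tables. This is an illustration of the general table-filling paradigm, not the paper's exact tagging scheme: the function name `fill_tables`, the use of two separate tables (one for entity spans, one for relations keyed by the head words of the two spans), and the label names in the example are all assumptions introduced here for clarity.

```python
def fill_tables(n_tokens, entities, relations):
    """Build two n x n word-pair tag tables for a tokenized sentence.

    entities:  list of (start, end, type) spans, `end` inclusive.
               Cell ent[start][end] holds the entity type for that span.
    relations: list of (head_idx, tail_idx, label), indexing into `entities`.
               Cell rel[head_start][tail_start] holds the relation label
               between the two spans' first tokens.
    """
    ent = [[None] * n_tokens for _ in range(n_tokens)]
    rel = [[None] * n_tokens for _ in range(n_tokens)]
    for start, end, etype in entities:
        ent[start][end] = etype              # entity span -> one table cell
    for head_idx, tail_idx, label in relations:
        head_start = entities[head_idx][0]
        tail_start = entities[tail_idx][0]
        rel[head_start][tail_start] = label  # relation -> word-pair cell
    return ent, rel


# Toy sentence: "John works at Google" (4 tokens).
# Hypothetical labels: PER, ORG entities and a Works_For relation.
ent, rel = fill_tables(
    4,
    entities=[(0, 0, "PER"), (3, 3, "ORG")],
    relations=[(0, 1, "Works_For")],
)
```

In this formulation, a model predicts a label for every token pair, so entity recognition and relation classification are decoded jointly from one 2D structure rather than in a pipeline.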

Keywords