Applied Sciences (Dec 2023)

Multi-Head Self-Attention-Enhanced Prototype Network with Contrastive–Center Loss for Few-Shot Relation Extraction

  • Jiangtao Ma,
  • Jia Cheng,
  • Yonggang Chen,
  • Kunlin Li,
  • Fan Zhang,
  • Zhanlei Shang

DOI: https://doi.org/10.3390/app14010103
Journal volume & issue: Vol. 14, no. 1, p. 103

Abstract

Few-shot relation extraction (FSRE) is a critical task in natural language processing (NLP): a model must learn relation characteristics from only a few labeled instances and then accurately classify new relations. Existing research concentrates primarily on prototype networks for FSRE, often enhancing them with external knowledge. However, these methods disregard potential interactions among the class prototypes; each prototype is learned and inferred from its own limited instances, which limits the robustness and reliability of the prototype representations. To address these concerns, this paper introduces a novel prototype network called SACT (multi-head Self-Attention and Contrastive–center loss), which draws more comprehensive and precise interaction information from the other prototypes to strengthen prototype reliability. First, SACT employs a multi-head self-attention mechanism to capture interactions among the prototypes produced by a conventional prototype network; this information aggregation reduces the noise that unknown categories with few samples introduce. Second, SACT introduces a new loss function, the contrastive–center loss, which clusters samples of the same relation category tightly around their center in the feature space while dispersing samples of different relation categories. Extensive experiments on FSRE datasets demonstrate the outstanding performance of SACT, providing strong evidence of its effectiveness and practicality.
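The abstract names two components but gives no implementation details, so the following is a minimal PyTorch sketch of both ideas under stated assumptions: the class name `AttentiveProtoNet`, the hidden size of 768, the number of heads, and the residual aggregation are illustrative choices, and the loss follows the common contrastive–center formulation (intra-class distance divided by total inter-class distance, as in Qi & Su, 2017), which may differ from the paper's exact variant.

```python
import torch
import torch.nn as nn


class AttentiveProtoNet(nn.Module):
    """Prototype network whose class prototypes exchange information
    through multi-head self-attention (illustrative hyperparameters)."""

    def __init__(self, hidden_dim: int = 768, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

    def forward(self, support: torch.Tensor, query: torch.Tensor):
        # support: (N_way, K_shot, D) encoded support instances
        # query:   (Q, D) encoded query instances
        protos = support.mean(dim=1)                # (N, D) vanilla prototypes
        seq = protos.unsqueeze(0)                   # (1, N, D) prototypes as a sequence
        refined, _ = self.attn(seq, seq, seq)       # prototypes attend to one another
        protos = protos + refined.squeeze(0)        # residual information aggregation
        logits = -torch.cdist(query, protos) ** 2   # nearer prototype -> higher score
        return logits, protos


def contrastive_center_loss(features, labels, centers, delta: float = 1e-6):
    """Pull each sample toward its own (learnable) class center while pushing
    it away from the other centers; an assumed ratio formulation.

    features: (B, D); labels: (B,) long; centers: (C, D) nn.Parameter.
    """
    d2 = torch.cdist(features, centers) ** 2               # (B, C) squared distances
    intra = d2.gather(1, labels.unsqueeze(1)).squeeze(1)   # distance to own center
    inter = d2.sum(dim=1) - intra                          # distance to all other centers
    return 0.5 * (intra / (inter + delta)).mean()
```

For a toy 5-way 1-shot episode, `AttentiveProtoNet()(torch.randn(5, 1, 768), torch.randn(10, 768))` yields (10, 5) logits; in training, the loss would typically be combined with cross-entropy over those logits via a weighting factor that the abstract does not specify.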

Keywords