IEEE Access (Jan 2024)

Few-Shot Relation Extraction Through Prompt With Relation Information and Multi-Level Contrastive Learning

  • Ye Dong,
  • Rong Yang,
  • Junbao Liu,
  • Xizhong Qin

DOI
https://doi.org/10.1109/ACCESS.2024.3452720
Journal volume & issue
Vol. 12
pp. 123352 – 123361

Abstract


Few-shot relation extraction predicts relations between entities using only limited labeled data. Recently, several studies have introduced prompts to better guide models in understanding the relations between entities. Although effective, these approaches ignore the hidden interaction information between support instances and relations, leaving the prompts without effective guidance. In addition, because labeled data are limited, the model cannot obtain enough information during training, leading to relation confusion. In this paper, we propose RelPromptCL, a few-shot relation extraction method that combines prompt learning with relation information and contrastive learning. Specifically, RelPromptCL first obtains more helpful information by utilizing prompt templates enriched with relation information, and then fuses the instance features with the relation features to obtain prototype representations. At the same time, multi-level contrastive learning increases the separation between different classes and improves the discriminative capability of the model. Finally, the similarity between the query instance and the prototypes is computed for relation classification. We carried out extensive experiments on two public datasets, FewRel 1.0 and FewRel 2.0. The results clearly show the effectiveness of RelPromptCL.
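The abstract's pipeline (fuse support-instance features with relation features into prototypes, then classify a query by similarity, trained with a contrastive objective) can be sketched as follows. This is a minimal illustration under assumed details the abstract does not specify: additive fusion of instance and relation embeddings, cosine similarity for classification, and an InfoNCE-style loss standing in for the paper's multi-level contrastive learning. All function names and shapes here are hypothetical.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project embeddings onto the unit sphere so dot products are cosine similarities."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def prototypes_with_relation(support, relation):
    """Build one prototype per class (N-way, K-shot).

    support:  (N, K, d) support-instance embeddings
    relation: (N, d)    relation-description embeddings
    Fusion is simple addition here -- an assumption, not the paper's exact operator.
    """
    inst_proto = support.mean(axis=1)            # (N, d) average over the K shots
    return l2_normalize(inst_proto + relation)   # fuse instance and relation features

def classify(query, protos):
    """Assign each query to the class of its most similar prototype."""
    sims = l2_normalize(query) @ protos.T        # (Q, N) cosine similarities
    return sims.argmax(axis=-1)

def contrastive_loss(query, protos, labels, tau=0.1):
    """InfoNCE-style loss pulling queries toward their class prototype
    and pushing them from the others (a stand-in for multi-level CL)."""
    logits = l2_normalize(query) @ protos.T / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Toy 2-way, 2-shot episode in a 2-d embedding space.
support = np.array([[[1.0, 0.0], [1.0, 0.1]],
                    [[0.0, 1.0], [0.1, 1.0]]])
relation = np.array([[1.0, 0.0], [0.0, 1.0]])
protos = prototypes_with_relation(support, relation)
query = np.array([[0.9, 0.1]])
pred = classify(query, protos)                   # query lies near class 0's prototype
```

Normalizing before the dot product makes the similarity scale-invariant, so the argmax depends only on direction; the temperature `tau` controls how sharply the contrastive loss concentrates on the hardest negatives.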

Keywords