Sensors (Feb 2023)

Text Summarization Method Based on Gated Attention Graph Neural Network

  • Jingui Huang,
  • Wenya Wu,
  • Jingyi Li,
  • Shengchun Wang

DOI
https://doi.org/10.3390/s23031654
Journal volume & issue
Vol. 23, no. 3
p. 1654

Abstract


Text summarization is an information compression technique for extracting the important information from long texts, and it has become a challenging research direction in natural language processing. Deep-learning-based summarization models currently achieve good results, but how to model the relationships between words more effectively, extract feature information more accurately, and eliminate redundant information remains an open problem. This paper proposes GA-GNN, a graph neural network model based on gated attention, which effectively improves the accuracy and readability of text summaries. First, words are encoded with a concatenated sentence encoder to generate deeper vectors containing both local and global semantic information. Second, gated attention units eliminate locally irrelevant information, improving the extraction of key features. Finally, the loss function is optimized in three respects, namely contrastive learning, confidence calculation for important sentences, and graph feature extraction, to improve the robustness of the model. Experiments on the CNN/Daily Mail and MR datasets show that the proposed model outperforms existing methods.
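The gated attention idea mentioned in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' GA-GNN implementation; the layer sizes, the use of multi-head self-attention, and the convex gate combination are assumptions made purely for illustration of how a sigmoid gate can suppress locally irrelevant features before they are passed on.

```python
import torch
import torch.nn as nn


class GatedAttentionUnit(nn.Module):
    """Illustrative gated attention unit (hypothetical, not the paper's code).

    A sigmoid gate computed from each token representation and its
    attention context scales the context, so locally irrelevant
    information can be down-weighted before further processing.
    """

    def __init__(self, hidden_dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim) encoded token or sentence vectors
        context, _ = self.attn(x, x, x)  # self-attention context
        g = torch.sigmoid(self.gate(torch.cat([x, context], dim=-1)))
        # Gate interpolates between the attended context and the input,
        # letting the model filter out locally irrelevant features.
        return g * context + (1.0 - g) * x


if __name__ == "__main__":
    unit = GatedAttentionUnit(hidden_dim=128)
    tokens = torch.randn(2, 50, 128)  # dummy batch of encoded tokens
    print(unit(tokens).shape)         # torch.Size([2, 50, 128])
```

The convex combination `g * context + (1 - g) * x` is one common way to realize such a gate; the paper may combine the gated output with graph features and the contrastive-learning loss in a different way.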

Keywords