Xi'an Gongcheng Daxue Xuebao (Apr 2022)

Improvement of transformer summary generation model integrating pointer network

  • LI Weiqian,
  • PU Chenglei

DOI
https://doi.org/10.13338/j.issn.1674-649x.2022.02.013
Journal volume & issue
Vol. 36, no. 2
pp. 94–100

Abstract

The traditional Encoder-Decoder model with an attention mechanism suffers from text redundancy, inconsistent representation, and out-of-vocabulary (OOV) words when applied to summarization tasks, resulting in low accuracy of the generated summaries. The transformer model, which embeds text position information, was improved by introducing a pointer network to assist decoding, allowing words to be copied directly from the source text when generating the summary. The effectiveness of the improved transformer model was verified on the LCSTS Chinese short-text summarization dataset. The results show that the model outperforms the benchmark model by an average of two points in ROUGE scores, and that the salience of the generated content and the fluency of the language are significantly improved while keeping the summary consistent with the input text.
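
Below is a minimal PyTorch sketch of the copy mechanism such a pointer network adds to a decoder. All names, shapes, and the mixing formulation (a learned gate p_gen interpolating between the decoder's vocabulary distribution and the attention distribution over source tokens, in the style of pointer-generator networks) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PointerGenerator(nn.Module):
    """Mixes the decoder's vocabulary distribution with a copy
    distribution over source tokens, so OOV words from the input
    can still appear in the generated summary."""
    def __init__(self, hidden_dim: int, vocab_size: int):
        super().__init__()
        self.vocab_proj = nn.Linear(hidden_dim, vocab_size)
        # p_gen is computed from the decoder state and attention context
        self.p_gen_proj = nn.Linear(2 * hidden_dim, 1)

    def forward(self, dec_state, context, attn_weights, src_ids,
                extended_vocab_size):
        # dec_state:    (batch, hidden)   decoder hidden state at this step
        # context:      (batch, hidden)   attention-weighted encoder summary
        # attn_weights: (batch, src_len)  attention over source positions
        # src_ids:      (batch, src_len)  source token ids in the extended vocab
        p_vocab = F.softmax(self.vocab_proj(dec_state), dim=-1)
        p_gen = torch.sigmoid(
            self.p_gen_proj(torch.cat([dec_state, context], dim=-1)))
        # Pad the generation distribution so it covers source-only (OOV) ids.
        extra = extended_vocab_size - p_vocab.size(-1)
        p_vocab = F.pad(p_vocab, (0, extra))
        # Scatter attention mass onto the ids of the source tokens.
        p_copy = torch.zeros(dec_state.size(0), extended_vocab_size)
        p_copy = p_copy.scatter_add_(1, src_ids, attn_weights)
        return p_gen * p_vocab + (1 - p_gen) * p_copy

# Example usage with toy shapes (batch=2, hidden=8, vocab=50, src_len=7):
ptr = PointerGenerator(hidden_dim=8, vocab_size=50)
dec = torch.randn(2, 8)
ctx = torch.randn(2, 8)
attn = F.softmax(torch.randn(2, 7), dim=-1)
src = torch.randint(0, 53, (2, 7))   # ids 50-52 are source-only OOVs
dist = ptr(dec, ctx, attn, src, extended_vocab_size=53)
assert torch.allclose(dist.sum(-1), torch.ones(2))

Because the copy distribution places probability mass directly on source token ids, a word that appears in the input can be produced in the summary even when it has no entry in the decoder vocabulary, which is how the pointer network addresses the OOV problem described above.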

Keywords