IEEE Access (Jan 2020)

Leveraging Pre-Trained Language Model for Summary Generation on Short Text

  • Shuai Zhao,
  • Fucheng You,
  • Zeng Yuan Liu

DOI
https://doi.org/10.1109/ACCESS.2020.3045748
Journal volume & issue
Vol. 8
pp. 228798–228803

Abstract

Bidirectional Encoder Representations from Transformers (BERT) represents the latest incarnation of pre-trained language models, which have achieved satisfactory results in text summarization tasks. However, BERT has not achieved good results on the generation of Chinese short-text summaries. In this work, we propose a novel short-text summary generation model based on keyword templates, which uses templates found in the training data to extract keywords that guide summary generation. Experimental results on the LCSTS dataset show that our model outperforms the baseline model, and the analysis shows that the methods used in our model can generate high-quality summaries.
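
The abstract only sketches the pipeline at a high level. The following is a minimal, hypothetical illustration of how template-based keyword guidance could work, not the paper's actual method: retrieval by token overlap, keyword extraction by intersecting the template summary with the input, and guidance by prepending keywords to the encoder input are all assumptions, and the function names (retrieve_template, extract_keywords, build_guided_input) are invented for this sketch.

    # Hypothetical sketch of keyword-template-guided summarization.
    # Assumptions (not from the paper): retrieval by token overlap,
    # keywords = template-summary tokens that also occur in the input,
    # guidance = prepending keywords to a BERT-style encoder's input.
    from collections import Counter

    def retrieve_template(input_text, train_pairs):
        """Pick the training (source, summary) pair whose source shares
        the most tokens with the input; its summary is the template."""
        input_tokens = set(input_text.split())
        def overlap(pair):
            return len(input_tokens & set(pair[0].split()))
        return max(train_pairs, key=overlap)

    def extract_keywords(input_text, template_summary, k=5):
        """Keep template-summary tokens that also appear in the input,
        ranked by their frequency in the input."""
        counts = Counter(input_text.split())
        shared = [t for t in template_summary.split() if t in counts]
        return sorted(set(shared), key=lambda t: -counts[t])[:k]

    def build_guided_input(input_text, keywords, sep="[SEP]"):
        """Concatenate keywords and source text so the encoder sees
        both, conditioning generation on the extracted keywords."""
        return " ".join(keywords) + f" {sep} " + input_text

    if __name__ == "__main__":
        train_pairs = [
            ("the central bank raised interest rates again this quarter",
             "bank raised rates"),
            ("heavy rain caused flooding in the southern provinces",
             "rain caused flooding"),
        ]
        src = "the central bank raised rates to curb inflation this quarter"
        _, template = retrieve_template(src, train_pairs)
        kws = extract_keywords(src, template)
        print(build_guided_input(src, kws))

The guided input string would then be fed to a pre-trained encoder-decoder summarizer; the paper's actual retrieval, keyword-extraction, and conditioning mechanisms are described in the full text.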

Keywords