CAAI Transactions on Intelligence Technology (Sep 2023)

Novel multi‐domain attention for abstractive summarisation

  • Chunxia Qu,
  • Ling Lu,
  • Aijuan Wang,
  • Wu Yang,
  • Yinong Chen

DOI
https://doi.org/10.1049/cit2.12117
Journal volume & issue
Vol. 8, no. 3
pp. 796 – 806

Abstract

Existing abstractive text summarisation models consider only word-level correlations between the source document and the reference summary; because of this narrow perspective, the generated summaries often fail to cover the subject of the source document. To address these shortcomings, a multi-domain attention pointer (MDA-Pointer) abstractive summarisation model is proposed in this work. First, the model uses bidirectional long short-term memory to encode the word and sentence sequences of the source document separately, obtaining semantic representations at both the word and sentence levels. A multi-domain attention mechanism is then established between these representations and the summary words, so that the model generates each summary word conditioned on both word- and sentence-level information. Next, summary words are either generated from the vocabulary or copied from the source word sequence through a pointer network, and a coverage mechanism is introduced at both the word and sentence levels to reduce redundancy in the summary. Finally, experiments on the CNN/Daily Mail dataset show that the ROUGE scores of the model improve both without and with the coverage mechanism, verifying the effectiveness of the proposed model.
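To make the architecture described above concrete, the following is a minimal PyTorch sketch of a single decoding step that attends over both word-level and sentence-level encoder states and mixes a vocabulary distribution with a copy distribution via a pointer network. The class name, layer sizes, and the additive-attention and pointer-generator formulation are illustrative assumptions; the abstract does not specify the exact fusion or gating used in MDA-Pointer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiDomainAttentionStep(nn.Module):
    """One decoding step with word- and sentence-level attention plus a pointer
    network (a sketch of the idea, not the paper's exact parameterisation)."""

    def __init__(self, hidden: int, vocab_size: int):
        super().__init__()
        self.word_proj = nn.Linear(2 * hidden, hidden)  # [state; word enc] -> score space
        self.sent_proj = nn.Linear(2 * hidden, hidden)  # [state; sent enc] -> score space
        self.v_w = nn.Linear(hidden, 1, bias=False)
        self.v_s = nn.Linear(hidden, 1, bias=False)
        self.out = nn.Linear(3 * hidden, vocab_size)    # state + word ctx + sent ctx
        self.p_gen = nn.Linear(3 * hidden, 1)           # generate-vs-copy switch

    def forward(self, dec_state, word_enc, sent_enc, src_ids):
        # dec_state: (B, H); word_enc: (B, Tw, H); sent_enc: (B, Ts, H)
        # src_ids: (B, Tw) vocabulary ids of the source tokens (for copying)
        def attend(enc, proj, v):
            q = dec_state.unsqueeze(1).expand(-1, enc.size(1), -1)
            scores = v(torch.tanh(proj(torch.cat([q, enc], dim=-1)))).squeeze(-1)
            attn = F.softmax(scores, dim=-1)                       # (B, T)
            ctx = torch.bmm(attn.unsqueeze(1), enc).squeeze(1)     # (B, H)
            return attn, ctx

        attn_w, ctx_w = attend(word_enc, self.word_proj, self.v_w)  # word domain
        attn_s, ctx_s = attend(sent_enc, self.sent_proj, self.v_s)  # sentence domain

        feats = torch.cat([dec_state, ctx_w, ctx_s], dim=-1)
        p_vocab = F.softmax(self.out(feats), dim=-1)                # generation dist
        p = torch.sigmoid(self.p_gen(feats))                        # mixing weight

        # Copy distribution: scatter word-level attention onto source token ids.
        copy = torch.zeros_like(p_vocab).scatter_add_(1, src_ids, attn_w)
        return p * p_vocab + (1 - p) * copy                         # final dist (B, V)
```

A quick shape check of one step, with arbitrary sizes:

```python
B, Tw, Ts, H, V = 2, 50, 6, 128, 30000
step = MultiDomainAttentionStep(H, V)
dist = step(torch.randn(B, H), torch.randn(B, Tw, H),
            torch.randn(B, Ts, H), torch.randint(0, V, (B, Tw)))
print(dist.shape)  # torch.Size([2, 30000])
```

The coverage mechanism mentioned in the abstract would additionally accumulate the word- and sentence-level attention weights across steps and penalise re-attending to already-covered positions; it is omitted here for brevity.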

Keywords