Applied Sciences (Sep 2024)

Machine Reading Comprehension Model Based on Fusion of Mixed Attention

  • Yanfeng Wang,
  • Ning Ma,
  • Zechen Guo

DOI
https://doi.org/10.3390/app14177794
Journal volume & issue
Vol. 14, no. 17
p. 7794

Abstract

To address the insufficient semantic fusion between text and questions and the neglect of global semantic information in existing machine reading comprehension models, we propose BERT_hybrid, a machine reading comprehension model based on BERT and a hybrid attention mechanism. In this model, BERT maps the text and the questions into the feature space separately. By integrating a Bi-LSTM, an attention mechanism, and a self-attention mechanism, the model achieves comprehensive semantic fusion between text and questions. The probability distribution over answers is computed with softmax. Experimental results on the public DuReader dataset show that the proposed model improves BLEU-4 and ROUGE-L scores over existing models. Furthermore, to validate the effectiveness of the model design, we analyze the factors influencing the model's performance.
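The abstract outlines a pipeline of separate BERT encoding, Bi-LSTM contextualization, passage–question attention, self-attention, and a softmax output layer. The sketch below shows one plausible way such a pipeline might be wired in PyTorch; it is not the authors' implementation, and the module name BertHybrid, the layer sizes, the head counts, and the span-extraction output head are all our assumptions for illustration.

```python
import torch
import torch.nn as nn

class BertHybrid(nn.Module):
    """Hypothetical sketch of the BERT_hybrid pipeline described in the abstract."""

    def __init__(self, bert, hidden=768, lstm_hidden=256, heads=8):
        super().__init__()
        self.bert = bert  # any BERT-style encoder returning last_hidden_state
        # Bi-LSTM contextualizes the BERT features of the passage
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True, bidirectional=True)
        self.q_proj = nn.Linear(hidden, 2 * lstm_hidden)
        # cross-attention: passage attends to the question (semantic fusion)
        self.cross_attn = nn.MultiheadAttention(2 * lstm_hidden, heads, batch_first=True)
        # self-attention captures global semantic information in the fused sequence
        self.self_attn = nn.MultiheadAttention(2 * lstm_hidden, heads, batch_first=True)
        # two logits per position (answer start / end); softmax gives the distribution
        self.span_head = nn.Linear(2 * lstm_hidden, 2)

    def forward(self, passage_ids, passage_mask, question_ids, question_mask):
        # BERT maps passage and question into the feature space separately
        p = self.bert(input_ids=passage_ids, attention_mask=passage_mask).last_hidden_state
        q = self.bert(input_ids=question_ids, attention_mask=question_mask).last_hidden_state
        p, _ = self.bilstm(p)
        q = self.q_proj(q)
        fused, _ = self.cross_attn(p, q, q, key_padding_mask=~question_mask.bool())
        fused, _ = self.self_attn(fused, fused, fused, key_padding_mask=~passage_mask.bool())
        start_logits, end_logits = self.span_head(fused).unbind(-1)
        # probability distributions over answer start and end positions
        return start_logits.softmax(-1), end_logits.softmax(-1)
```

Under these assumptions, `bert` could be, for example, `transformers.BertModel.from_pretrained("bert-base-chinese")` for the Chinese DuReader dataset; the paper itself may use a generative rather than span-extraction output, which the abstract does not specify.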

Keywords