Frontiers in Public Health (May 2022)

Automated Multi-View Multi-Modal Assessment of COVID-19 Patients Using Reciprocal Attention and Biomedical Transform

  • Yanhan Li,
  • Hongyun Zhao,
  • Tian Gan,
  • Yang Liu,
  • Lian Zou,
  • Ting Xu,
  • Xuan Chen,
  • Cien Fan,
  • Meng Wu

DOI
https://doi.org/10.3389/fpubh.2022.886958
Journal volume & issue
Vol. 10

Abstract

Automated severity assessment of coronavirus disease 2019 (COVID-19) patients can help rationally allocate medical resources and improve patients' survival rates. Existing methods conduct severity assessment mainly on a single modality and a single view, which excludes potentially informative interactions across views and modalities. To tackle this problem, we propose a multi-view multi-modal deep learning model that automatically assesses the severity of COVID-19 patients. The proposed model receives multi-view ultrasound images and biomedical indices of patients and generates comprehensive features for the assessment task. We also propose a reciprocal attention module to capture the underlying interactions between multi-view ultrasound data, and a biomedical transform module to integrate biomedical data with ultrasound data into multi-modal features. Trained and tested on compound datasets, the proposed model achieves 92.75% accuracy and 80.95% recall, the best performance among the compared state-of-the-art methods. Ablation experiments and further discussion consistently confirm the feasibility and advancement of the proposed model.
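The abstract does not give the module definitions, but the core idea of reciprocal attention — two views each attending to the other before fusion with biomedical features — can be sketched in plain NumPy. All function names, the residual connections, and the fusion-by-concatenation below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feat, kv_feat):
    # Scaled dot-product attention: queries from one view,
    # keys/values from the other view.
    d = q_feat.shape[-1]
    scores = q_feat @ kv_feat.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ kv_feat

def reciprocal_attention(view_a, view_b):
    # Each view attends to the other; residual connections keep
    # the original per-view features.
    a_attends_b = cross_attention(view_a, view_b)
    b_attends_a = cross_attention(view_b, view_a)
    return view_a + a_attends_b, view_b + b_attends_a

rng = np.random.default_rng(0)
view_a = rng.normal(size=(4, 8))   # 4 tokens of an 8-dim ultrasound view
view_b = rng.normal(size=(4, 8))   # a second ultrasound view
fa, fb = reciprocal_attention(view_a, view_b)

# Hypothetical biomedical-index features, fused by simple concatenation
# (the paper uses a learned biomedical transform module instead).
bio = rng.normal(size=(8,))
fused = np.concatenate([fa.mean(axis=0), fb.mean(axis=0), bio])
```

The `fused` vector stands in for the comprehensive multi-modal feature that the model would pass to its severity-classification head.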

Keywords