Frontiers in Medicine (Feb 2022)

MEDUSA: Multi-Scale Encoder-Decoder Self-Attention Deep Neural Network Architecture for Medical Image Analysis

  • Hossein Aboutalebi,
  • Maya Pavlova,
  • Hayden Gunraj,
  • Mohammad Javad Shafiee,
  • Ali Sabri,
  • Amer Alaref,
  • Alexander Wong

DOI
https://doi.org/10.3389/fmed.2021.821120
Journal volume & issue
Vol. 8

Abstract

Medical image analysis continues to hold interesting challenges given the subtle characteristics of certain diseases and the significant overlap in appearance between diseases. In this study, we explore the concept of self-attention for tackling such subtleties in and between diseases. To this end, we introduce a multi-scale encoder-decoder self-attention (MEDUSA) mechanism tailored for medical image analysis. While self-attention deep convolutional neural network architectures in the existing literature center around multiple isolated lightweight attention mechanisms with limited individual capacities incorporated at different points in the network architecture, MEDUSA departs significantly from this notion: it possesses a single, unified self-attention mechanism of considerably higher capacity, with multiple attention heads feeding into different scales of the network architecture. To the best of the authors' knowledge, this is the first “single body, multi-scale heads” realization of self-attention; it enables explicit global context to be shared across selective attention at different levels of representational abstraction while still allowing differing local attention context at each individual level. With MEDUSA, we obtain state-of-the-art performance on multiple challenging medical image analysis benchmarks, including COVIDx, Radiological Society of North America (RSNA) RICORD, and the RSNA Pneumonia Challenge, when compared to previous work. Our MEDUSA model is publicly available.
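The “single body, multi-scale heads” idea described above can be sketched roughly as follows: a single shared self-attention computation produces one global attended context, and lightweight per-scale heads project (and spatially pool) that same context for injection at different scales of a backbone. This is an illustrative assumption-laden sketch, not the paper's implementation; all dimensions, weight shapes, and the average-pooling scheme are made up for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Shared "body": one high-capacity self-attention over the full-resolution
# token grid (hypothetical sizes, not the paper's configuration).
d = 16            # channel dimension
n = 64            # tokens at full resolution (an 8x8 feature map, flattened)
x = rng.standard_normal((n, d))

Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
A = softmax((x @ Wq) @ (x @ Wk).T / np.sqrt(d))   # single global attention map
context = A @ (x @ Wv)                            # one shared attended context

def pool_tokens(t, factor):
    # Average-pool an (n, d) token grid assumed square (s x s, s = sqrt(n)).
    s = int(np.sqrt(t.shape[0]))
    g = t.reshape(s // factor, factor, s // factor, factor, -1).mean(axis=(1, 3))
    return g.reshape(-1, g.shape[-1])

# Multi-scale "heads": per-scale projections of the SAME shared context,
# pooled to each scale's spatial resolution before gating that scale.
heads = {}
for scale, factor in [("full", 1), ("half", 2), ("quarter", 4)]:
    Wh = rng.standard_normal((d, d)) * 0.1        # per-scale head projection
    heads[scale] = pool_tokens(context, factor) @ Wh

for scale, h in heads.items():
    print(scale, h.shape)
```

The contrast with the isolated-attention designs the abstract mentions is that here every head reads from one globally computed context, so attention at coarse scales remains consistent with attention at fine scales.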

Keywords