IEEE Access (Jan 2021)

Scale-Aware Transformers for Diagnosing Melanocytic Lesions

  • Wenjun Wu,
  • Sachin Mehta,
  • Shima Nofallah,
  • Stevan Knezevich,
  • Caitlin J. May,
  • Oliver H. Chang,
  • Joann G. Elmore,
  • Linda G. Shapiro

DOI
https://doi.org/10.1109/ACCESS.2021.3132958
Journal volume & issue
Vol. 9
pp. 163526–163541

Abstract

Diagnosing melanocytic lesions is one of the most challenging areas of pathology, with extensive intra- and inter-observer variability. The gold standard for diagnosing invasive melanoma is the examination of histopathological whole slide images of skin biopsies by an experienced dermatopathologist. Digitized whole slide images offer novel opportunities for computer programs to improve the diagnostic performance of pathologists. Automatically classifying such images requires representations that reflect both the content and the context of the input. In this paper, we introduce a novel self-attention-based network that learns representations from digital whole slide images of melanocytic skin lesions at multiple scales. Our model softly weights the representations from the different scales, allowing it to automatically discriminate between diagnosis-relevant and diagnosis-irrelevant information. Our experiments show that our method outperforms five other state-of-the-art whole slide image classification methods by a significant margin. Our method also achieves performance comparable to that of 187 practicing U.S. pathologists who interpreted the same cases in an independent study. To facilitate related research, the full training and inference code is publicly available at https://github.com/meredith-wenjunwu/ScATNet.
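
The abstract mentions softly weighting representations from multiple scales. As a rough illustration only, the PyTorch sketch below shows one way such scale-level soft weighting could be realized with a learned attention score per scale; the class name ScaleAttentionPool, the number of scales, and the embedding size are hypothetical and are not taken from the paper, whose actual implementation is available at the linked repository.

    # Hypothetical sketch (not the authors' ScATNet code): softly weighting
    # per-scale representations with a learned attention score, as described
    # in the abstract.
    import torch
    import torch.nn as nn

    class ScaleAttentionPool(nn.Module):
        """Combine per-scale embeddings into one slide-level representation."""
        def __init__(self, embed_dim: int):
            super().__init__()
            # One scalar score per scale, computed from that scale's embedding.
            self.score = nn.Linear(embed_dim, 1)

        def forward(self, scale_embeddings: torch.Tensor) -> torch.Tensor:
            # scale_embeddings: (num_scales, embed_dim), one row per scale.
            weights = torch.softmax(self.score(scale_embeddings), dim=0)  # (num_scales, 1)
            # Soft weighting: scales judged more diagnosis-relevant get higher weight.
            return (weights * scale_embeddings).sum(dim=0)                # (embed_dim,)

    # Example: three scales, each summarized by a 256-d vector
    # (dimensions chosen arbitrarily for illustration).
    pool = ScaleAttentionPool(embed_dim=256)
    fused = pool(torch.randn(3, 256))
    print(fused.shape)  # torch.Size([256])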

Keywords