IEEE Access (Jan 2019)

PnP-AdaNet: Plug-and-Play Adversarial Domain Adaptation Network at Unpaired Cross-Modality Cardiac Segmentation

  • Qi Dou,
  • Cheng Ouyang,
  • Cheng Chen,
  • Hao Chen,
  • Ben Glocker,
  • Xiahai Zhuang,
  • Pheng-Ann Heng

DOI
https://doi.org/10.1109/ACCESS.2019.2929258
Journal volume & issue
Vol. 7
pp. 99065 – 99076

Abstract


Deep convolutional networks have demonstrated state-of-the-art performance on various challenging medical image processing tasks. Leveraging images from different modalities for the same analysis task holds large clinical benefits. However, the generalization capability of deep networks to test data sampled from a different distribution remains a major challenge. In this paper, we propose a plug-and-play adversarial domain adaptation network (PnP-AdaNet) for adapting segmentation networks between different modalities of medical images, e.g., MRI and CT. We tackle the significant domain shift by aligning the feature spaces of the source and target domains at multiple scales in an unsupervised manner. With an adversarial loss, we learn a domain adaptation module that flexibly replaces the early encoder layers of the source network, while the higher layers are shared between the two domains. We validate our domain adaptation method on cardiac segmentation in unpaired MRI and CT images, covering four anatomical structures. The average Dice score reached 63.9%, a significant recovery from the complete failure (Dice score of 13.2%) observed when an MRI segmentation network is tested directly on CT data. In addition, our proposed PnP-AdaNet outperforms many state-of-the-art unsupervised domain adaptation approaches on the same dataset. Experimental results with comprehensive ablation studies demonstrate the efficacy of our method for unsupervised cross-modality domain adaptation. Our code is publicly available at https://github.com/carrenD/Medical-Cross-Modality-Domain-Adaptation.
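The plug-and-play idea in the abstract can be sketched in a few lines: a domain-specific early encoder is swapped in for the target modality while the higher layers stay shared, and a feature-space discriminator supplies the adversarial signal. The NumPy sketch below is a toy illustration only; all dimensions, weights, and function names are hypothetical and not taken from the paper's actual architecture or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy dimensions (not from the paper).
D_IN, D_FEAT = 16, 8

# Source-domain early encoder (e.g. trained on MRI, then frozen).
W_src = rng.standard_normal((D_IN, D_FEAT))
# Target-domain "plug-and-play" adaptation module: same shape, so it
# can replace the source early layers while higher layers are reused.
W_tgt = rng.standard_normal((D_IN, D_FEAT))

# Shared higher layers (segmentation head, fixed across domains);
# 4 output channels stand in for the four cardiac structures.
W_shared = rng.standard_normal((D_FEAT, 4))

def features(x, W_early):
    # Domain-specific early feature extraction.
    return relu(x @ W_early)

def segment(x, W_early):
    # Early layers are domain-specific; higher layers are shared.
    return features(x, W_early) @ W_shared

# Domain discriminator on the feature space: tries to tell
# source features (label 1) from target features (label 0).
w_disc = rng.standard_normal(D_FEAT)

def disc(f):
    return sigmoid(f @ w_disc)

x_src = rng.standard_normal((5, D_IN))  # stand-in MRI batch
x_tgt = rng.standard_normal((5, D_IN))  # stand-in CT batch

f_src = features(x_src, W_src)
f_tgt = features(x_tgt, W_tgt)

# Discriminator loss: binary cross-entropy on domain labels.
eps = 1e-8
loss_d = (-np.mean(np.log(disc(f_src) + eps))
          - np.mean(np.log(1.0 - disc(f_tgt) + eps)))

# Adversarial loss for the adaptation module: fool the discriminator
# into labelling target features as source, aligning the feature spaces.
loss_adv = -np.mean(np.log(disc(f_tgt) + eps))
```

In the full method this alignment is applied at multiple feature scales and optimized jointly; the sketch shows only a single scale to make the swap-in structure of the adaptation module explicit.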

Keywords