Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization (Dec 2024)

Transfer learning for deep neural networks-based classification of breast cancer X-ray images

  • Tuan Linh Le,
  • My Hanh Bui,
  • Ngoc Cuong Nguyen,
  • Manh Toan Ha,
  • Anh Nguyen,
  • Hoang Phuong Nguyen

DOI: https://doi.org/10.1080/21681163.2023.2275708
Journal volume & issue: Vol. 12, no. 1

Abstract


Deep neural networks (DNNs) are now useful tools for mammogram classification in breast cancer screening. However, in Vietnam the number of mammograms available for training DNNs is relatively small. This study therefore applies transfer learning techniques to improve the performance of DNN models. In the first step, 10,418 breast cancer images from the Digital Database for Screening Mammography (DDSM) were used to train a ResNet-34 convolutional neural network. In the second step, this model was fine-tuned on the Hanoi Medical University (HMU) database of 6,248 Vietnamese mammograms. On the test dataset, the best ResNet-34 model achieves a macro-averaged AUC (macAUC) of 0.766 in classifying breast cancer X-ray images into three Breast Imaging-Reporting and Data System (BI-RADS) categories: BI-RADS 045 (‘incomplete and malignancy’), BI-RADS 1 (‘normal’), and BI-RADS 23 (‘benign’). This exceeds the macAUC of 0.754 achieved by a ResNet-50 model trained only on the HMU X-ray dataset of 7,912 breast cancer images. A comparison with other published models shows that the proposed transfer-learned ResNet-34 achieves higher evaluation results than the compared models.
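The macAUC reported above is a macro-average of per-class one-vs-rest AUCs over the three BI-RADS categories. The following minimal NumPy sketch illustrates that metric under the standard one-vs-rest definition; it is an assumption for illustration, not the authors' evaluation code.

```python
import numpy as np

def binary_auc(y_true, scores):
    """AUC for one binary class via the rank-based (Mann-Whitney U) formula.

    y_true: 0/1 array; scores: predicted probability for the positive class.
    Assumes at least one positive and one negative sample.
    """
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    # Average ranks over tied scores so ties contribute 0.5 each.
    for s in np.unique(scores):
        tied = scores == s
        ranks[tied] = ranks[tied].mean()
    n_pos = int(y_true.sum())
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def mac_auc(y_true, probs):
    """Macro-averaged one-vs-rest AUC (macAUC) over all classes.

    y_true: integer class labels; probs: (n_samples, n_classes) scores.
    """
    n_classes = probs.shape[1]
    return float(np.mean([
        binary_auc((y_true == c).astype(int), probs[:, c])
        for c in range(n_classes)
    ]))

# Toy usage with three classes (e.g. BI-RADS 045 / 1 / 23 as labels 0/1/2):
labels = np.array([0, 1, 2, 0, 1, 2])
perfect = np.eye(3)[labels]          # ideal classifier
print(mac_auc(labels, perfect))      # → 1.0
```

Each class is scored against the rest, so a class-imbalanced category (such as a rare BI-RADS group) contributes equally to the macro average rather than being dominated by the majority class.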

Keywords