IEEE Access (Jan 2019)

Multi-View Convolutional Neural Networks for Mammographic Image Classification

  • Lilei Sun
  • Junqian Wang
  • Zhijun Hu
  • Yong Xu
  • Zhongwei Cui

DOI
https://doi.org/10.1109/ACCESS.2019.2939167
Journal volume & issue
Vol. 7
pp. 126273–126282

Abstract


In recent years, deep learning has been widely applied to mammographic image classification. However, most existing methods are based on a single mammography view and cannot extract sufficiently discriminative features, resulting in unsatisfactory classification accuracy. To address this problem and improve classification performance, we propose a novel multi-view convolutional neural network (CNN) that operates on multiple mammography views. Because images acquired from different perspectives contain different and complementary information about the breast mass, we modify the CNN architecture to exploit the complementary information across views. The new architecture extracts discriminative features from the mediolateral oblique (MLO) and craniocaudal (CC) views of the mammographic images and effectively fuses these features for classification. Dilated convolutional layers enable the feature maps extracted from the multiple breast mass views to capture information from a larger receptive field ("field of vision"), and multi-scale features are obtained by combining standard and dilated convolutions. In addition, we incorporate a penalty term into the cross-entropy loss function, which reduces the misclassification rate by increasing the contributions of samples misclassified during training. The proposed method was evaluated and compared with several state-of-the-art methods on the public Digital Database for Screening Mammography (DDSM) and Mammographic Image Analysis Society (MIAS) datasets. The experimental results show that the proposed method outperforms these state-of-the-art methods.
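To make the two key ideas in the abstract concrete, below is a minimal PyTorch sketch of (a) a two-branch network that combines standard and dilated convolutions within each view and fuses CC/MLO features before classification, and (b) a cross-entropy loss with a penalty term that up-weights misclassified samples. All names (ViewBranch, MultiViewCNN, penalized_cross_entropy), layer sizes, and the exact form of the penalty are illustrative assumptions, not the paper's published architecture or loss.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ViewBranch(nn.Module):
    """Per-view feature extractor: a standard and a dilated convolution run
    in parallel, yielding multi-scale features (hypothetical layout)."""
    def __init__(self, in_ch=1, ch=32):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, ch, 3, padding=1)                 # local detail
        self.dilated = nn.Conv2d(in_ch, ch, 3, padding=2, dilation=2)  # larger receptive field
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x):
        f = torch.cat([F.relu(self.conv(x)), F.relu(self.dilated(x))], dim=1)
        return self.pool(f).flatten(1)  # (N, 2*ch)

class MultiViewCNN(nn.Module):
    """Fuses features from the CC and MLO views before classification."""
    def __init__(self, num_classes=2, ch=32):
        super().__init__()
        self.cc_branch = ViewBranch(ch=ch)
        self.mlo_branch = ViewBranch(ch=ch)
        self.fc = nn.Linear(4 * ch, num_classes)

    def forward(self, cc, mlo):
        fused = torch.cat([self.cc_branch(cc), self.mlo_branch(mlo)], dim=1)
        return self.fc(fused)

def penalized_cross_entropy(logits, targets, gamma=1.0):
    """Cross-entropy with a penalty that enlarges the contribution of
    currently misclassified samples (one plausible reading of the paper's
    penalty term; the published formulation may differ)."""
    ce = F.cross_entropy(logits, targets, reduction="none")
    wrong = (logits.argmax(dim=1) != targets).float()
    return (ce * (1.0 + gamma * wrong)).mean()

# Toy usage with random tensors standing in for CC/MLO mammogram patches.
model = MultiViewCNN()
cc = torch.randn(4, 1, 64, 64)
mlo = torch.randn(4, 1, 64, 64)
labels = torch.randint(0, 2, (4,))
loss = penalized_cross_entropy(model(cc, mlo), labels)
loss.backward()

Concatenating per-view feature vectors is only one possible fusion strategy; the design point the abstract emphasizes is that the CC and MLO branches are trained jointly so complementary information from both views reaches the classifier.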
