IET Image Processing (Feb 2023)

Global weighted average pooling network with multilevel feature fusion for weakly supervised brain tumor segmentation

  • Zi‐Wei Li,
  • Shi‐Bin Xuan,
  • Xue‐Dong He,
  • Li Wang

DOI
https://doi.org/10.1049/ipr2.12642
Journal volume & issue
Vol. 17, no. 2
pp. 418 – 427

Abstract

Medical image segmentation plays a vital role in computer‐aided diagnosis and intelligent medical treatment: it preprocesses medical images to help doctors diagnose diseases more accurately. The class activation map (CAM) is an important technique in weakly supervised segmentation, enabling image segmentation without pixel‐level labels for training, which suits the needs of medical image segmentation well. However, the CAM obtained is still imperfect because of global average pooling (GAP): GAP gives important and non‐important regions equal attention during training, so the CAM cannot demarcate the boundaries of target regions well. To solve this problem, a global weighted average pooling network that fuses the grayscale information of medical images is proposed. The proposed network addresses GAP's equal treatment of important and non‐important regions of the feature map, because different weights can be learned for different positions of the feature map before the GAP step. At the same time, because of the grayscale difference between the tumor area and the non‐tumor area in brain tumor images, the low‐level grayscale information of the medical image is fused with the high‐level semantic information extracted by the network to learn these weights, exploiting the complementary strengths of feature maps at different levels. Experimental results on the popular medical image dataset BraTS2019 show that the proposed method markedly improves the quality of the CAM and helps it fit object boundaries. In the DSC evaluation, the proposed method achieves a score of 64.1%, a 4.6% improvement over a recent research method.
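To make the pooling idea concrete, the sketch below shows one plausible PyTorch realisation of weighted global average pooling with multilevel fusion as described in the abstract: per‐position weights are predicted from a concatenation of high‐level features and the (resized) grayscale input, then used in place of uniform GAP weights. The 1×1 convolution weight head, the softmax normalisation, and the bilinear resizing are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalWeightedAvgPool(nn.Module):
    """Pool a feature map with learned spatial weights instead of uniform ones."""

    def __init__(self, high_channels: int, low_channels: int = 1):
        super().__init__()
        # Hypothetical weight head: fuse high-level features with the resized
        # grayscale image and predict one weight per spatial position.
        self.weight_head = nn.Conv2d(high_channels + low_channels, 1, kernel_size=1)

    def forward(self, feat: torch.Tensor, gray: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W) high-level semantic features
        # gray: (B, 1, H0, W0) low-level grayscale image
        gray = F.interpolate(gray, size=feat.shape[-2:], mode="bilinear",
                             align_corners=False)
        fused = torch.cat([feat, gray], dim=1)
        # Softmax over spatial positions keeps the weights non-negative and
        # summing to 1; plain GAP corresponds to a uniform weight 1/(H*W).
        w = self.weight_head(fused).flatten(2).softmax(dim=-1)   # (B, 1, H*W)
        pooled = (feat.flatten(2) * w).sum(dim=-1)               # (B, C)
        return pooled


if __name__ == "__main__":
    pool = GlobalWeightedAvgPool(high_channels=256)
    features = torch.randn(2, 256, 16, 16)    # high-level features
    grayscale = torch.randn(2, 1, 128, 128)   # grayscale input
    print(pool(features, grayscale).shape)    # torch.Size([2, 256])
```

Under this reading, positions whose fused low‐level and high‐level evidence suggests tumor tissue receive larger weights, so the classifier and hence the CAM is driven mainly by those regions rather than by the whole image equally.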