陆军军医大学学报 (Jul 2023)
Intelligent segmentation of prostate zones in MR images based on Vgg16-Unet
Abstract
Objective To develop a deep learning model for intelligent segmentation of prostate zones on magnetic resonance (MR) images.

Methods T2WI sequence data were collected from 33 patients with prostate cancer (6 at stage T1, 15 at stage T2, 9 at stage T3, and 3 at stage T4) who underwent MRI scanning in Shanxi Provincial Cancer Hospital between January 2018 and October 2020, together with T2WI sequence data of 346 patients with prostate cancer from the Prostate MRI Public Dataset provided by Radboud University Nijmegen Medical Centre, Netherlands. The 379 cases in total were randomly divided into a training set, a validation set and a test set at a ratio of 7∶1∶2 (265, 38 and 76 cases, respectively). Based on the Unet architecture, VGG16 was adopted as the encoder and combined with a multi-layer convolutional decoder and a transfer learning strategy to build the Vgg16-Unet model. With the prostate zones (anterior fibromuscular stroma, central zone, peripheral zone, and transition zone) manually segmented by doctors as the gold standard, the Dice similarity coefficient (DSC) and the 95% Hausdorff distance (HD95) were used to evaluate segmentation accuracy on the test set.

Results The model achieved relatively high segmentation accuracy for the anterior fibromuscular stroma, central zone, peripheral zone and transition zone on the test set, with average DSC of 56.95%, 47.28%, 80.78% and 90.63%, and average HD95 of 20.84, 20.02, 15.39 and 11.20 mm, respectively. The volumes predicted by the model were consistent with those measured by manual segmentation, with the differences largely falling within the 95% limits of agreement.

Conclusion The segmentation accuracy of the Vgg16-Unet model is better than that of 3 classical variants (Unet, Unet++ and ResUnet++). The model can significantly improve the segmentation efficiency of prostate cancer MRI images and reduce the workload of clinicians.
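To make the Methods concrete, the sketch below illustrates the kind of model and metric described: a U-Net whose encoder is an ImageNet-pretrained VGG16 (the transfer-learning strategy), plus a Dice similarity coefficient helper. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation; the encoder stage splits, decoder channel sizes, the single-channel handling of T2WI input and the `dice_coefficient` helper are illustrative.

```python
# A minimal sketch, NOT the authors' released code: a U-Net whose encoder is an
# ImageNet-pretrained VGG16, matching the transfer-learning idea described above.
# Stage splits, channel sizes, the 1-channel-to-3-channel trick for T2WI input,
# and the dice_coefficient helper are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import vgg16, VGG16_Weights


class DecoderBlock(nn.Module):
    """Upsample, concatenate the matching encoder feature map, then convolve."""

    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=2, stride=2)
        self.conv = nn.Sequential(
            nn.Conv2d(out_ch + skip_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x, skip):
        x = self.up(x)
        x = torch.cat([x, skip], dim=1)
        return self.conv(x)


class Vgg16Unet(nn.Module):
    """U-Net with a VGG16 encoder; 5 output channels = background + 4 prostate zones."""

    def __init__(self, num_classes=5):
        super().__init__()
        feats = vgg16(weights=VGG16_Weights.IMAGENET1K_V1).features
        # Encoder stages of VGG16; each later stage starts with its max-pooling layer.
        self.enc1 = feats[:4]      # -> 64 channels, full resolution
        self.enc2 = feats[4:9]     # -> 128 channels, 1/2
        self.enc3 = feats[9:16]    # -> 256 channels, 1/4
        self.enc4 = feats[16:23]   # -> 512 channels, 1/8
        self.enc5 = feats[23:30]   # -> 512 channels, 1/16 (bottleneck)
        self.dec4 = DecoderBlock(512, 512, 512)
        self.dec3 = DecoderBlock(512, 256, 256)
        self.dec2 = DecoderBlock(256, 128, 128)
        self.dec1 = DecoderBlock(128, 64, 64)
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)

    def forward(self, x):
        # T2WI slices are single-channel; repeat to 3 channels for the VGG16 stem.
        if x.shape[1] == 1:
            x = x.repeat(1, 3, 1, 1)
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        e3 = self.enc3(e2)
        e4 = self.enc4(e3)
        e5 = self.enc5(e4)
        d4 = self.dec4(e5, e4)
        d3 = self.dec3(d4, e3)
        d2 = self.dec2(d3, e2)
        d1 = self.dec1(d2, e1)
        return self.head(d1)  # per-pixel logits over the 5 classes


def dice_coefficient(pred_mask, gt_mask, eps=1e-6):
    """DSC between two binary masks (bool or 0/1 tensors of the same shape)."""
    pred = pred_mask.float()
    gt = gt_mask.float()
    inter = (pred * gt).sum()
    return (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)


if __name__ == "__main__":
    model = Vgg16Unet()
    logits = model(torch.randn(1, 1, 256, 256))   # one 256x256 T2WI slice
    print(logits.shape)                           # torch.Size([1, 5, 256, 256])
    zone = logits.argmax(dim=1) == 3              # e.g. one predicted zone as a binary mask
    print(float(dice_coefficient(zone, zone)))    # 1.0 on a self-comparison
```

In practice, HD95 is usually computed with an existing library (for example, `medpy.metric.binary.hd95`) rather than reimplemented, and the loss function, input size and class indexing would follow the original paper.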
Keywords