Journal of Clinical and Translational Science (Apr 2022)
466 Convolutional Neural Networks and Machine Learning in the Identification of Ultrasonographic Features of Ovarian Morphology
Abstract
OBJECTIVES/GOALS: To develop a two-stage convolutional neural network that identifies the ovary and antral follicles within ovarian ultrasound images, and to determine its reliability and feasibility compared with conventional 2D and 3D ultrasonographic image-analysis techniques.

METHODS/STUDY POPULATION: De-identified, archived ultrasonographic images of women across the reproductive spectrum (N=500) will be used in the study. Experienced raters will label these images to train a two-stage convolutional neural network (CU-Net). CU-Net will first separate the entire ovary from the background and then identify all antral follicles within the ovary. Following training, CU-Net will evaluate an independent set of images (N=100) to determine its accuracy. Three specialized raters will establish the reliability and feasibility of CU-Net compared with conventional 2D and 3D ovarian ultrasound image-analysis methods.

RESULTS/ANTICIPATED RESULTS: The labeled training dataset of ovarian ultrasound images is expected to train CU-Net successfully, allowing accurate identification of the ovary and the total number of antral follicles in the independent test set. CU-Net is expected to match the accuracy of the gold-standard method (2D-Offline with Grid) and to outperform other approaches, such as 2D-Real Time and 3D volume software (VOCAL and Sono-AVC). Moreover, CU-Net is anticipated to be the fastest and most reliable method across users, supporting its clinical feasibility.

DISCUSSION/SIGNIFICANCE: This study will translate directly into a standardized platform that can improve the accuracy, reliability, and time demand of ovarian ultrasound evaluation across users and across clinical and research settings.
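The abstract does not give implementation details for CU-Net, but the two-stage logic it describes — stage 1 segments the ovary from the background, stage 2 identifies follicles only within that ovary — can be sketched with binary masks standing in for the network outputs. The function and variable names below are hypothetical, and simple connected-component counting stands in for the follicle-identification stage.

```python
import numpy as np

def count_follicles(ovary_mask, follicle_mask):
    """Stage-2 logic of a hypothetical two-stage pipeline: count connected
    follicle regions, considering only pixels inside the stage-1 ovary mask."""
    # Restrict follicle detections to the ovary (stage-1 output).
    inside = follicle_mask & ovary_mask
    visited = np.zeros_like(inside, dtype=bool)
    rows, cols = inside.shape
    count = 0
    for r in range(rows):
        for c in range(cols):
            if inside[r, c] and not visited[r, c]:
                count += 1
                # Flood fill (4-connectivity) to mark this follicle's pixels.
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and inside[y, x] and not visited[y, x]:
                        visited[y, x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Toy example: a 6x6 ovary containing two follicles, plus one spurious
# detection outside the ovary that the stage-1 mask filters out.
ovary = np.zeros((6, 6), dtype=bool)
ovary[1:5, 1:5] = True
follicles = np.zeros((6, 6), dtype=bool)
follicles[1, 1] = True        # follicle 1 (single pixel)
follicles[3:5, 3:5] = True    # follicle 2 (2x2 block)
follicles[5, 5] = True        # outside the ovary -> ignored
print(count_follicles(ovary, follicles))  # → 2
```

In the study itself both masks would come from the trained CNN stages rather than hand-built arrays; the point of the sketch is only that restricting stage 2 to the stage-1 segmentation suppresses detections outside the ovary.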