Purpose: Segmentation of ovarian/adnexal masses from surrounding tissue on ultrasound images is a challenging task. The separation of masses into different components may also be important for radiomic feature extraction. Our study aimed to develop an artificial intelligence-based automatic segmentation method for transvaginal ultrasound images that (1) outlines the exterior boundary of adnexal masses and (2) separates internal components.

Approach: A retrospective ultrasound imaging database of adnexal masses was reviewed for exclusion criteria at the patient, mass, and image levels, with one image per mass. The resulting 54 adnexal masses (36 benign/18 malignant) from 53 patients were separated by patient into training (26 benign/12 malignant) and independent test (10 benign/6 malignant) sets. U-net segmentation performance on test images compared to expert detailed outlines was measured using the Dice similarity coefficient (DSC) and the ratio of the Hausdorff distance to the effective diameter of the outline (RHD-D) for each mass. Subsequently, in discovery mode, a two-level fuzzy c-means (FCM) unsupervised clustering approach was used to separate the pixels within masses belonging to hypoechoic or hyperechoic components.

Results: The DSC (median [95% confidence interval]) was 0.91 [0.78, 0.96], and RHD-D was 0.04 [0.01, 0.12], indicating strong agreement with expert outlines. Clinical review of the internal separation of masses into echogenic components demonstrated a strong association with mass characteristics.

Conclusion: A combined U-net and FCM algorithm for automatic segmentation of adnexal masses and their internal components achieved excellent results compared with expert outlines and review, supporting future radiomic feature-based classification of the masses by components.
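The evaluation metrics above (DSC and RHD-D) can be sketched in NumPy/SciPy as follows. This is a minimal illustration, not the authors' implementation: function names are mine, and the abstract does not specify whether the Hausdorff distance is computed over boundary pixels or all mask pixels, so this sketch uses all mask pixels and defines the effective diameter as that of a circle with the same area as the expert outline.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff


def dice_coefficient(pred, truth):
    """Dice similarity coefficient between two binary masks."""
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum())


def rhd_d(pred, truth):
    """Ratio of the (symmetric) Hausdorff distance between the two
    masks to the effective diameter of the reference outline, where
    the effective diameter is that of a circle of equal area.
    NOTE: computed over all mask pixels here; a boundary-based
    variant is an equally plausible reading of the abstract."""
    p = np.argwhere(pred)
    t = np.argwhere(truth)
    hd = max(directed_hausdorff(p, t)[0], directed_hausdorff(t, p)[0])
    effective_diameter = 2.0 * np.sqrt(truth.sum() / np.pi)
    return hd / effective_diameter
```

A perfect segmentation gives DSC = 1 and RHD-D = 0; the reported medians (DSC 0.91, RHD-D 0.04) indicate near-complete overlap with a worst-case boundary error of about 4% of the mass diameter.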
Radiomic ultrasound-based artificial intelligence (AI) tools may improve adnexal mass evaluations by introducing more quantitative assessments. Detailed segmentation of the lesions is the first step in a radiomics AI classification pipeline. However, accurate outlining is a difficult task, prone to error, and time-consuming. We aimed to develop an automatic method to reduce variability and improve clinical workflow. To evaluate the robustness of using retrospective data, we investigated whether images with sonographic measurement markups interfere with automatic segmentations. A retrospective dataset of 296 images from 106 adnexal lesions (53 benign/53 malignant) was separated by patient into training (19 benign/17 malignant; 89 images) and independent test (34 benign/36 malignant; 207 images) sets. The U-Net was trained twice, using images with and without markups. Images were cropped to 20 pixels surrounding the outline and resized to 256×256 pixels. The training set was augmented using flips and rotations. U-Net segmentation performance was compared to expert outlines using the Dice similarity coefficient (DSC) and the average Hausdorff distance (HD), and the two training conditions were compared using the median and 95% confidence interval (CI) of the difference, with statistical significance indicated if the 95% CI did not cross zero. The median DSC and HD on the independent test set were 0.92 and 14.4, respectively, when markups were included in training, and 0.91 and 17.1, respectively, without markups in training. The differences were 0.008 [95% CI: -0.033, 0.056] for DSC and -1.23 [95% CI: -13.5, 7.33] for HD, indicating no evidence of a statistically significant difference in performance. A U-Net algorithm for automatically outlining adnexal lesions had excellent agreement with expert outlines, independent of the presence of measurement markups, supporting AI pipeline development to differentiate between benign and malignant adnexal masses.
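The preprocessing described above (crop to a 20-pixel margin around the outline, resize to 256×256, augment with flips and rotations) could be implemented along these lines. This is a sketch under stated assumptions: function names are illustrative, the abstract does not name a resizing library (scipy.ndimage is used here), and the exact augmentation set (which rotations, which flips) is not specified, so 90-degree rotations plus horizontal flips are shown as one plausible choice.

```python
import numpy as np
from scipy import ndimage


def crop_and_resize(image, mask, margin=20, size=256):
    """Crop image and mask to the mask's bounding box plus a fixed
    pixel margin, then resize to size x size (bilinear interpolation
    for the image, nearest-neighbor for the binary mask)."""
    ys, xs = np.nonzero(mask)
    y0, y1 = max(ys.min() - margin, 0), min(ys.max() + margin + 1, mask.shape[0])
    x0, x1 = max(xs.min() - margin, 0), min(xs.max() + margin + 1, mask.shape[1])
    img_c, msk_c = image[y0:y1, x0:x1], mask[y0:y1, x0:x1]
    zoom = (size / img_c.shape[0], size / img_c.shape[1])
    img_r = ndimage.zoom(img_c, zoom, order=1)
    msk_r = ndimage.zoom(msk_c.astype(float), zoom, order=0) > 0.5
    return img_r, msk_r


def augment(image, mask):
    """Yield 90-degree-rotated and horizontally flipped copies of an
    image/mask pair (8 variants per input, the dihedral group D4)."""
    for k in range(4):
        img_k, msk_k = np.rot90(image, k), np.rot90(mask, k)
        yield img_k, msk_k
        yield np.fliplr(img_k), np.fliplr(msk_k)
```

Applying identical geometric transforms to the image and its mask keeps the segmentation labels aligned, which is essential for training a U-Net on augmented pairs.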