Transrectal ultrasound (TRUS) fusion-guided biopsy and brachytherapy (BT) offer promising diagnostic and therapeutic improvements over conventional practice for prostate cancer. One key component of these procedures is accurate segmentation of the prostate in three-dimensional (3D) TRUS images to define the margins used for targeting and guidance. However, manual prostate segmentation is a time-consuming and difficult process that must be completed by the physician intraoperatively, often while the patient is under sedation (biopsy) or anesthesia (BT). Providing physicians with a quick and accurate prostate segmentation immediately after acquiring a 3D TRUS image could benefit multiple minimally invasive prostate interventional procedures and greatly reduce procedure time. Our solution to this limitation is a convolutional neural network that segments the prostate in 3D TRUS images acquired with multiple commercial ultrasound systems. A modified U-Net was trained on 84 end-fire and 122 side-fire 3D TRUS images acquired during clinical biopsy and BT procedures. Our approach to 3D segmentation involved prediction on 2D radial slices, which were then reconstructed into a 3D geometry. Manual contours provided the annotations for the training, validation, and testing datasets, with the testing dataset consisting of 20 unseen 3D side-fire images. Pixel map comparisons (Dice similarity coefficient (DSC), recall, and precision) and the volume percent difference (VPD) were computed to assess segmentation error. Our algorithm achieved a median DSC of 93.5% and a median VPD of 5.89% with a computation time under 0.7 s, offering the possibility of reduced treatment time during prostate interventional procedures.
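The reported evaluation metrics are standard pixel- and volume-based comparisons. As a minimal sketch of how they could be computed (assuming binary 3D masks stored as NumPy arrays; the function names and the voxel_volume_mm3 parameter are illustrative, not taken from the authors' code):

```python
import numpy as np

def dice_similarity(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks, in percent."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 200.0 * intersection / (pred.sum() + truth.sum())

def recall_precision(pred: np.ndarray, truth: np.ndarray) -> tuple[float, float]:
    """Recall and precision of the predicted mask against the manual contour."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()
    return tp / truth.sum(), tp / pred.sum()

def volume_percent_difference(pred: np.ndarray, truth: np.ndarray,
                              voxel_volume_mm3: float = 1.0) -> float:
    """Volume percent difference (VPD) relative to the manual segmentation."""
    v_pred = pred.astype(bool).sum() * voxel_volume_mm3
    v_truth = truth.astype(bool).sum() * voxel_volume_mm3
    return 100.0 * abs(v_pred - v_truth) / v_truth
```

For the 3D results reported above, these metrics would be applied to the reconstructed 3D segmentation rather than to the individual 2D radial slices.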
High-dose-rate interstitial gynecologic brachytherapy requires multiple needles to be inserted into the tumor and surrounding area while avoiding nearby healthy organs-at-risk (OARs), including the bladder and rectum. We propose the use of a 360° three-dimensional (3D) transvaginal ultrasound (TVUS) guidance system for needle visualization and report on the implementation of two automatic needle segmentation algorithms to aid intraoperative needle localization. Two-dimensional (2D) needle segmentation, which allows immediate adjustment of needle trajectories to mitigate needle deflection and avoid OARs, was implemented in near real-time using a convolutional neural network with a U-Net architecture trained on a dataset of 2D ultrasound images containing needle-like structures from multiple applications. In 18 unseen TVUS images, the median position difference [95% confidence interval] was 0.27 [0.20, 0.68] mm and the mean angular difference was 0.50 [0.27, 1.16]° between manually and algorithmically segmented needles. Automatic needle segmentation in 3D TVUS images was performed using an algorithm leveraging the randomized 3D Hough transform. All needles were accurately localized in a proof-of-concept image, with a median position difference of 0.79 [0.62, 0.93] mm and a median angular difference of 0.46 [0.31, 0.62]° compared to manual segmentations. Further investigation into the robustness of the algorithm in complex cases containing large shadowing, air, or reverberation artefacts is ongoing. Intraoperative automatic needle segmentation in interstitial gynecologic brachytherapy has the potential to improve implant quality and could enable 3D ultrasound to be used for treatment planning, eliminating the requirement for post-insertion CT scans.
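The randomized 3D Hough transform is named here only at a high level. The sketch below (an illustrative simplification, not the authors' implementation) shows the core idea for localizing a single straight needle: candidate voxel coordinates, assumed to come from intensity thresholding of the 3D TVUS image, vote for lines defined by randomly sampled point pairs, and the best-supported line is retained.

```python
import numpy as np

def randomized_line_fit(points: np.ndarray, n_trials: int = 500,
                        inlier_tol_mm: float = 1.0, seed: int | None = None):
    """Randomized Hough-style search for the dominant 3D line among candidate
    needle voxels (an (N, 3) array of coordinates in mm). Repeatedly samples
    two points, forms the line they define, and keeps the line with the most
    inliers within inlier_tol_mm.  Returns ((point, direction), support)."""
    rng = np.random.default_rng(seed)
    best_line, best_support = None, -1
    for _ in range(n_trials):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm < 1e-6:  # degenerate pair; resample
            continue
        d = d / norm
        # Perpendicular distance of every candidate point to the line (p, d).
        diff = points - p
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        support = int((dist < inlier_tol_mm).sum())
        if support > best_support:
            best_support, best_line = support, (p, d)
    return best_line, best_support
```

For a multi-needle implant, such a search would typically be repeated after removing the inliers of each detected line; the position and angular differences reported above could then be computed between each fitted line and its manually segmented counterpart.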