High-dose-rate brachytherapy is a common component of cervical cancer treatment. During this procedure, radioactive sources are placed adjacent to the malignancy using specialized applicators or interstitial needles. To ensure accurate dose delivery and positive patient outcomes, medical imaging is used intra-procedurally to verify precise placement of the applicator. The fusion of three-dimensional ultrasound images has previously been investigated as an alternative volumetric imaging technique during cervical brachytherapy; however, the need to manually register the two three-dimensional ultrasound images offline resulted in excessively large registration errors. To overcome this limitation, we designed and developed a tracked, automated mechatronic system that inherently registers three-dimensional ultrasound images in real time. We performed a system calibration using an external coordinate system transform and validated the system tracking using a commercial optical tracker. Both experiments indicated sub-millimeter system accuracy, demonstrating the strong performance of our device. Future work includes phantom validation experiments and translation of our device into clinical use.
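The inherent registration described above can be sketched as composing tracked poses: if both ultrasound acquisitions are expressed in a common tracker (world) frame, the transform mapping image A into image B follows directly from the two tracked probe poses, with no manual offline step. A minimal NumPy sketch with homogeneous 4x4 transforms; all pose values below are hypothetical, not the system's calibration:

```python
import numpy as np

def pose(rotation_z_deg, translation):
    """Build a 4x4 homogeneous transform: rotation about z, then translation."""
    t = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(t), -np.sin(t), 0],
                 [np.sin(t),  np.cos(t), 0],
                 [0,          0,         1]]
    T[:3, 3] = translation
    return T

def register(T_world_from_A, T_world_from_B):
    """Transform mapping points in image A's frame into image B's frame."""
    return np.linalg.inv(T_world_from_B) @ T_world_from_A

# Hypothetical tracked probe poses at the two acquisitions.
T_A = pose(30.0, [10.0, 5.0, 0.0])
T_B = pose(-15.0, [2.0, 8.0, 1.0])

T_B_from_A = register(T_A, T_B)
p_A = np.array([1.0, 2.0, 3.0, 1.0])   # a point in image A (homogeneous)
p_B = T_B_from_A @ p_A                 # the same physical point, in image B
```

Because both poses are known from the mechatronic tracking, the registration is available immediately at acquisition time, which is the key difference from the offline manual approach.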
High dose rate brachytherapy is a common procedure used in the treatment of gynecological cancers to irradiate malignant tumors while sparing the surrounding healthy tissue. While treatment may be delivered using a variety of applicator types, a hybrid technique consisting of an intracavitary applicator and interstitial needles allows for highly localized placement of the radioactive sources. To ensure an accurate and precise procedure, identification of the applicator and needle tips is necessary. The use of three-dimensional (3D) transrectal ultrasound (TRUS) and transabdominal ultrasound (TAUS) imaging has been previously investigated for the visualization of the intracavitary applicators. However, due to image artifacts from the applicator, needle tip identification is severely restricted when using a single 3D US view. To overcome this limitation and improve treatment outcome, we propose the use of image fusion to combine TRUS and TAUS images for the complete visualization of the applicator, needle tips, and surrounding anatomy. In this proof-of-concept work, we use a multimodality anthropomorphic pelvic phantom to assess the feasibility of image fusion and needle visualization using a hybrid brachytherapy applicator. We found that fused 3D US images resulted in accurate visualization of the pertinent structures when compared with magnetic resonance images. The results of this study demonstrate the future potential of image fusion in gynecological brachytherapy applications to ensure high treatment quality and reduce radiation dose to surrounding healthy tissue. This work is currently being expanded to other applicator types and is being applied to patients in a clinical trial.
KEYWORDS: 3D image processing, 3D acquisition, Computed tomography, Visualization, Tumors, Ultrasonography, Image visualization, 3D image reconstruction, High dynamic range imaging, Imaging systems
Brachytherapy, a type of radiotherapy, may be used to place radioactive sources into or in close proximity to tumors, providing a method for conformally escalating dose in the tumor and the local area surrounding the malignancy. High-dose-rate interstitial brachytherapy of vaginal tumors requires precise placement of multiple needles through holes in a plastic perineal template to deliver treatment while optimizing dose and avoiding overexposure of nearby organs at risk (OARs). Despite the importance of needle placement, image guidance for adaptive, intraoperative needle visualization, allowing misdirected needles to be identified and corrected during insertion, is not standard practice. We have developed a 360-deg three-dimensional (3-D) transvaginal ultrasound (TVUS) system using a conventional probe with a template-compatible custom sonolucent vaginal cylinder and propose its use for intraoperative needle guidance during interstitial gynecologic brachytherapy. We describe the 3-D TVUS mechanism and geometric validation, present mock phantom procedure results, and report on needle localization accuracy in patients. For the six patients imaged, landmark anatomical features and all needles were clearly visible. The implementation of 360-deg 3-D TVUS through a sonolucent vaginal cylinder provides a technique for visualizing needles and OARs intraoperatively during interstitial gynecologic brachytherapy, enabling implants to be assessed and providing the potential for image guidance.
During high-dose-rate (HDR) interstitial brachytherapy of gynecologic malignancies, precise placement of multiple needles is necessary to provide optimal dose to the tumor while avoiding overexposing nearby healthy organs, such as the bladder and rectum. Needles are currently placed based on preoperative imaging and clinical examination but there is currently no standard for intraoperative image guidance. We propose the use of a three-dimensional (3D) ultrasound (US) system incorporating three scanning geometries: 3D transrectal US (TRUS), 360° 3D sidefire transvaginal US (TVUS), and 3D endfire TVUS, to provide an accessible and versatile tool for intraoperative image guidance during interstitial gynecologic brachytherapy. Images are generated in 12–20 s by rotating a conventional two-dimensional US probe, providing a reconstructed 3D image immediately following acquisition. Studies of needles in patient images show mean differences in needle positions of 3.82 ± 1.86 mm and 2.36 ± 0.97 mm in TRUS and sidefire TVUS, respectively, when compared to the clinical x-ray computed tomography (CT) images. A proof-of-concept phantom study of the endfire TVUS mode demonstrated a mean positional difference of 1.91 ± 0.24 mm. Additionally, an automatic needle segmentation tool was tested on a 360° 3D TVUS patient image resulting in a mean angular difference of 0.44 ± 0.19° and mean positional difference of 0.78 ± 0.17 mm when compared to manually segmented needles. The implementation of 3D US image guidance during HDR interstitial gynecologic brachytherapy provides a versatile intraoperative system with the potential for improved implant quality and reduced risk to nearby organs.
In high-dose-rate (HDR) interstitial gynecologic brachytherapy, needles are positioned into the tumor and surrounding area through a template to deliver radiotherapy. Optimal dose and avoidance of nearby organs requires precise needle placement; however, there is currently no standard method for intra-operative needle visualization or guidance. We have developed and validated a 360° three-dimensional (3D) transvaginal ultrasound (TVUS) system and created a sonolucent vaginal cylinder that is compatible with the current template to accommodate a conventional side-fire ultrasound probe. This probe is rotated inside the hollow sonolucent cylinder to generate a 3D image. We propose the use of this device for intra-operative verification of brachytherapy needle locations. In a feasibility study, the first ever 360° 3D TVUS image of a gynecologic brachytherapy patient was acquired and the image allowed key features, including bladder, rectum, vaginal wall, and bowel, to be visualized with needles clearly identifiable. Three patients were then imaged following needle insertion (28 needles total) and positions of the needles in the 3D TVUS image were compared to the clinical x-ray computed tomography (CT) image, yielding a mean trajectory difference of 1.67 ± 0.75°. The first and last visible points on each needle were selected in each modality and compared; the point pair with the larger distance was selected as the maximum difference in needle position with a mean maximum difference of 2.33 ± 0.78 mm. This study demonstrates that 360° 3D TVUS may be a feasible approach for intra-operative needle localization during HDR interstitial brachytherapy of gynecologic malignancies.
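The needle comparison described above (the trajectory angle between modalities, plus the larger of the two endpoint distances) can be sketched as follows. The needle coordinates are hypothetical illustrations, not patient data:

```python
import numpy as np

def needle_metrics(us_first, us_last, ct_first, ct_last):
    """Trajectory angle (deg) and maximum endpoint distance between the
    needle as localized in 3D US and in CT. Inputs are 3D points (mm):
    the first and last visible points of the needle in each modality."""
    us_dir = np.asarray(us_last, float) - np.asarray(us_first, float)
    ct_dir = np.asarray(ct_last, float) - np.asarray(ct_first, float)
    cos = np.dot(us_dir, ct_dir) / (np.linalg.norm(us_dir) * np.linalg.norm(ct_dir))
    angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    d_first = np.linalg.norm(np.asarray(us_first, float) - np.asarray(ct_first, float))
    d_last = np.linalg.norm(np.asarray(us_last, float) - np.asarray(ct_last, float))
    # The point pair with the larger distance is taken as the maximum
    # difference in needle position, as in the study above.
    return angle, max(d_first, d_last)

# Hypothetical needle endpoints (mm) in the two image spaces.
angle, max_diff = needle_metrics([0, 0, 0], [0, 0, 100],
                                 [1, 0, 0], [1, 2, 100])
```

Averaging these two metrics over all needles in an implant gives per-patient summaries of the kind reported above.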
KEYWORDS: Ultrasonography, 3D image processing, Visualization, Distance measurement, Cancer, Oncology, 3D acquisition, Magnetic resonance imaging, Imaging systems, 3D scanning
Treatment for gynaecological cancers often includes brachytherapy; in particular, in high-dose-rate (HDR) interstitial brachytherapy, hollow needles are inserted into the tumour and surrounding area through a template in order to deliver the radiation dose. Currently, there is no standard modality for visualizing needles intra-operatively, despite the need for precise needle placement in order to deliver the optimal dose and avoid nearby organs, including the bladder and rectum. While three-dimensional (3D) transrectal ultrasound (TRUS) imaging has been proposed for 3D intra-operative needle guidance, anterior needles tend to be obscured by shadowing created by the template's vaginal cylinder. We have developed a 360-degree 3D transvaginal ultrasound (TVUS) system that uses a conventional two-dimensional side-fire TRUS probe rotated inside a hollow vaginal cylinder made from a sonolucent plastic (TPX). The system was validated using grid and sphere phantoms in order to test the geometric accuracy of the distance and volumetric measurements in the reconstructed image. To test the potential for visualizing needles, an agar phantom mimicking the geometry of the female pelvis was used. Needles were inserted into the phantom and then imaged using the 3D TVUS system. The needle trajectories and tip positions in the 3D TVUS scan were compared to their expected values and the needle tracks visualized in magnetic resonance images. Based on this initial study, 360-degree 3D TVUS imaging through a sonolucent vaginal cylinder is a feasible technique for intra-operatively visualizing needles during HDR interstitial gynaecological brachytherapy.
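Reconstructing a 3D image from a 2D probe rotated inside the cylinder amounts to a rotational scan conversion: each Cartesian voxel is mapped to cylindrical coordinates and filled from the acquired 2D frame whose rotation angle is closest to the voxel's azimuth. A simplified nearest-neighbour sketch; the frame geometry, counts, and spacings below are illustrative, not the actual system parameters:

```python
import numpy as np

def rotational_reconstruction(frames, angles_deg, half_width_mm, voxel_mm, pixel_mm):
    """Nearest-neighbour scan conversion of rotated 2D frames into a Cartesian
    volume. Each frame is (n_r, n_z): rows = radial depth, cols = elevation z."""
    n_r, n_z = frames[0].shape
    n = int(2 * half_width_mm / voxel_mm)
    coords = (np.arange(n) + 0.5) * voxel_mm - half_width_mm
    xs, ys = np.meshgrid(coords, coords, indexing="ij")
    theta = np.degrees(np.arctan2(ys, xs)) % 360.0          # voxel azimuth
    diff = np.abs(theta[..., None] - np.asarray(angles_deg, float))
    frame_idx = np.minimum(diff, 360.0 - diff).argmin(axis=-1)  # wrap-aware
    r_idx = np.clip((np.hypot(xs, ys) / pixel_mm).astype(int), 0, n_r - 1)
    vol = np.empty((n, n, n_z), dtype=frames[0].dtype)
    for z in range(n_z):
        plane = np.stack([f[:, z] for f in frames])         # (n_frames, n_r)
        vol[:, :, z] = plane[frame_idx, r_idx]
    return vol

# Four hypothetical constant-valued frames acquired at 90° intervals.
frames = [np.full((10, 3), v) for v in (1.0, 2.0, 3.0, 4.0)]
vol = rotational_reconstruction(frames, [0, 90, 180, 270], 5.0, 1.0, 1.0)
```

A production reconstruction would interpolate between neighbouring frames rather than snapping to the nearest one, but the coordinate mapping is the same.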
Background: High-dose-rate brachytherapy (HDR-BT) is a prostate cancer treatment option involving the insertion of hollow needles into the gland through the perineum to deliver a radioactive source. Conventional needle imaging involves indexing a trans-rectal ultrasound (TRUS) probe in the superior/inferior (S/I) direction, using the axial transducer to produce an image set for organ segmentation. These images have limited resolution in the needle insertion direction (S/I), so the sagittal transducer is used to identify needle tips, requiring a manual registration with the axial view. This registration introduces a source of uncertainty in the final segmentations and subsequent treatment plan. Our lab has developed a device enabling 3D-TRUS guided insertions with high S/I spatial resolution, eliminating the need to align axial and sagittal views.
Purpose: To compare HDR-BT needle tip localization accuracy between 2D and 3D-TRUS.
Methods: 5 prostate cancer patients underwent conventional 2D TRUS guided HDR-BT, during which 3D images were also acquired for post-operative registration and segmentation. Needle end-length measurements were taken, providing a gold standard for insertion depths.
Results: 73 needles were analyzed from all 5 patients. Needle tip position differences between imaging techniques were found to be largest in the S/I direction with mean±SD of -2.5±4.0 mm. End-length measurements indicated that 3D TRUS provided a statistically significantly lower mean±SD insertion depth error of -0.2±3.4 mm versus 2.3±3.7 mm with 2D guidance (p < .001).
Conclusions: 3D TRUS may provide more accurate HDR-BT needle localization than conventional 2D TRUS guidance for the majority of HDR-BT needles.
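The per-needle comparison above is a paired design: each needle has a 2D-guidance error and a 3D-guidance error against the same end-length gold standard, so the depth errors can be compared with a paired t statistic. A stdlib-only sketch; the error values below are hypothetical, not the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(diffs):
    """Paired t statistic for per-needle differences (e.g. 2D error - 3D error)."""
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical per-needle insertion-depth errors (mm) vs the end-length standard.
err_2d = [2.1, 3.0, 1.8, 2.6, 2.4, 1.9, 2.8, 2.2]
err_3d = [0.1, -0.4, 0.3, -0.2, 0.0, 0.2, -0.1, 0.1]
t = paired_t([a - b for a, b in zip(err_2d, err_3d)])
```

The t statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the p-value reported in the results.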
KEYWORDS: Biopsy, 3D image processing, 3D acquisition, Mammography, Breast, Ultrasonography, Imaging systems, Real time imaging, Tissues, Target recognition
A 3D ultrasound (US)-guided biopsy system was developed to supplement stereotactic mammography (SM) with near real-time 3D and real-time 2D US imaging. We have combined features from SM and US-guided biopsy, including breast stabilisation, a confined needle trajectory and dual-modality imaging. We have evaluated our procedure using breast phantoms, in terms of its accuracy with US-guided biopsy. Phantoms made of animal tissue with embedded phantom 'lesions' allowed us to test the biopsy accuracy of our procedure. We have also registered the SM image space to US image space, and both spaces to the mechanical geometry of the needle trajectory. Evaluation experiments have shown that our US-guided biopsy procedure was capable of placing the needle tip with 0.85 mm accuracy at a target identified in the 3D image. We also identified that we could successfully biopsy artificial lesions that were 3.2 mm in diameter, with a 96% success rate. As an adjunct to stereotactic mammography, we propose that this system could provide more complete information for target identification and real-time monitoring of needle insertion, as well as providing a means for rapid confirmation of biopsy success with 3D ultrasound.
KEYWORDS: Biopsy, Breast, Ultrasonography, 3D acquisition, 3D image processing, Transducers, 3D scanning, 3D metrology, Statistical analysis, Biological research
We introduce a mechanically constrained, 3D ultrasound-guided core-needle breast biopsy device. With modest breast compression, 3D ultrasound scans localize suspicious masses. A biopsy needle is mechanically guided into position for firing into the sampling region. The needle is parallel to the transducer, allowing real-time guidance during needle insertion. Lesion sampling is verified by another ultrasound image after firing. Two procedures quantified the targeting accuracy of this apparatus. First, we biopsied eleven breast phantoms containing 123 embedded cylindrical lesions constructed from PVA-C (poly(vinyl alcohol) cryogel) with diameters ranging from 1.6 to 15.9 mm. Identification of the colored lesion in the biopsy sample and analysis of the post-biopsy US images provided a model for the success rates. Using this, we predict that our apparatus will require six passes to biopsy a 3.0 mm lesion with 99% confidence. For the second experiment, agar phantoms were embedded with four rows of 0.8 mm stainless steel beads. A 14-gauge needle was inserted to each bead position seen in a 3D ultrasound scan and the tip position was compared to the pre-insertion bead position. The inter-observer standard errors of measurement were less than 0.15 and 0.28 mm for the bead and needle tip positions, respectively. The off-axis 3D 95% confidence intervals were determined to have widths between 0.43 and 1.71 mm, depending on direction and bead position.
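The pass-count prediction above follows from modelling repeated biopsy passes as independent attempts: with per-pass success probability p, n passes succeed at least once with probability 1 - (1 - p)^n, so the required n is the smallest integer satisfying the confidence target. A sketch; the per-pass probability used below is hypothetical, not the paper's fitted value:

```python
import math

def passes_needed(p_single, confidence=0.99):
    """Smallest n with 1 - (1 - p_single)**n >= confidence,
    assuming passes are independent with equal success probability."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_single))

# e.g. a hypothetical 54% per-pass success rate for a small lesion
n = passes_needed(0.54)
```

In practice the per-pass probability would be estimated from the phantom success-rate model as a function of lesion diameter, and independence between passes is itself a modelling assumption.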
A major limitation of the use of endoscopes in minimally invasive surgery is the lack of relative context between the endoscope and its surroundings. The purpose of this work is to map endoscopic images to surfaces obtained from 3D preoperative MR or CT data, for assistance in surgical planning and guidance. To test our methods, we acquired preoperative CT images of a standard brain phantom from which object surfaces were extracted. Endoscopic images were acquired using a neuro-endoscope tracked with an optical tracking system, and the optical properties of the endoscope were characterized using a simple calibration procedure. Registration of the phantom and CT images was accomplished using markers that could be identified both on the physical object and in the preoperative images. The endoscopic images were rectified for radial lens distortion, and then mapped onto the extracted surfaces via a ray-traced texture-mapping algorithm, which explicitly accounts for surface obliquity. The optical tracker has an accuracy of about 0.3 mm, which allows the endoscope tip to be localized to within millimeters. The mapping operation allows the endoscopic images to be effectively 'painted' onto the surfaces as they are acquired. Panoramic and stereoscopic visualization and navigation of the painted surfaces may then be performed from arbitrary orientations that were not necessarily those from which the original endoscopic views were acquired.
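The radial rectification step above is commonly modelled as a polynomial in the squared radius about the distortion centre. A minimal sketch of that idea; the coefficients and the direction of correction are hypothetical, and the endoscope's actual calibration may use a different model:

```python
def undistort(x_d, y_d, k1, k2=0.0, cx=0.0, cy=0.0):
    """First-order radial distortion correction: move each distorted pixel
    along its radius from the distortion centre (cx, cy) by a polynomial
    factor in r^2. Sign conventions vary between calibration models."""
    dx, dy = x_d - cx, y_d - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale

# A pixel near the edge of a barrel-distorted endoscope frame
# (normalized coordinates, hypothetical coefficient):
x_u, y_u = undistort(0.8, 0.1, k1=0.05)
```

Points near the centre move little while points near the edge move most, which is why endoscope images appear most warped at the periphery.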