By exploiting the energy dependence of photoelectric and Compton interactions, dual-energy CT (DECT) can be used to derive a number of physics-based parametric maps, such as the relative stopping power map (RSPM). The accuracy of DECT-derived parametric maps depends on image noise levels and the severity of artifacts. Suboptimal image quality may degrade the accuracy of physics-based mapping techniques and affect subsequent processing for clinical applications. In this study, we propose a deep-learning-based method to accurately generate the RSPM from virtual monoenergetic images as an alternative to physics-based dual-energy approaches. For the training target of our deep-learning model, we manually segmented head-and-neck DECT images into brain, bone, fat, soft tissue, lung, and air, and then assigned tissue-specific RSP values to the corresponding regions to generate a reference RSPM. We integrated residual blocks into a cycle-consistent generative adversarial network (cycleGAN) framework to learn the nonlinear mapping between DECT 70 keV/140 keV monoenergetic image pairs and the reference RSPM. We evaluated the proposed method on 18 head-and-neck cancer patients. Mean absolute error (MAE) and mean error (ME) were used to quantify the differences between the generated and reference RSPM. Across all patients, the average MAE between the generated and reference RSPM was 3.1±0.4% and the average ME was 1.5±0.5%. Compared to the physics-based method, the proposed method significantly improved RSPM accuracy and offered comparable computational efficiency after training.
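To make the network design concrete, the sketch below shows what a residual-block generator inside a cycleGAN framework might look like in PyTorch: a two-channel input (the 70 keV/140 keV monoenergetic pair), a convolutional encoder, a residual bottleneck, and a decoder producing a single-channel RSPM. This is a minimal illustration under stated assumptions, not the authors' published configuration; the layer widths, normalization choices, and number of residual blocks are placeholders.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Residual block: two 3x3 convolutions with an identity skip connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Skip connection lets the block learn a residual correction.
        return x + self.body(x)


class Generator(nn.Module):
    """Encoder / residual bottleneck / decoder generator.

    Input: 2-channel 70 keV / 140 keV monoenergetic image pair.
    Output: 1-channel relative stopping power map (RSPM).
    Widths and depth are illustrative assumptions.
    """

    def __init__(self, in_ch: int = 2, out_ch: int = 1, base: int = 64, n_res: int = 6):
        super().__init__()
        layers = [
            nn.Conv2d(in_ch, base, kernel_size=7, padding=3),
            nn.InstanceNorm2d(base),
            nn.ReLU(inplace=True),
            # Downsampling stages
            nn.Conv2d(base, base * 2, kernel_size=3, stride=2, padding=1),
            nn.InstanceNorm2d(base * 2),
            nn.ReLU(inplace=True),
            nn.Conv2d(base * 2, base * 4, kernel_size=3, stride=2, padding=1),
            nn.InstanceNorm2d(base * 4),
            nn.ReLU(inplace=True),
        ]
        # Residual bottleneck
        layers += [ResidualBlock(base * 4) for _ in range(n_res)]
        # Upsampling stages back to the input resolution
        layers += [
            nn.ConvTranspose2d(base * 4, base * 2, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.InstanceNorm2d(base * 2),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base * 2, base, kernel_size=3, stride=2,
                               padding=1, output_padding=1),
            nn.InstanceNorm2d(base),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, out_ch, kernel_size=7, padding=3),
        ]
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# Usage sketch: a batch of 256x256 monoenergetic image pairs -> RSPM predictions.
if __name__ == "__main__":
    gen = Generator()
    pair = torch.randn(4, 2, 256, 256)  # (batch, [70 keV, 140 keV], H, W)
    rspm = gen(pair)                     # (4, 1, 256, 256)
    print(rspm.shape)
```

In a full cycleGAN setup, a second generator mapping RSPM back to the monoenergetic domain and two discriminators would supply the adversarial and cycle-consistency losses; those components are omitted here for brevity.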