Purpose: Uncertainty estimation has gained significant attention in recent years for its potential to enhance the performance of deep learning (DL) algorithms in medical applications and even to address domain-shift challenges. However, incorporating uncertainty estimation into a DL system in a way that yields a tangible benefit is not straightforward. The objective of our work is to evaluate whether the proposed spatial uncertainty aggregation (SUA) framework improves the effectiveness of uncertainty estimation in segmentation tasks. We evaluate whether SUA strengthens the observed correlation between uncertainty estimates and false negative (FN) predictions, and whether the observed benefits translate into tangible improvements in segmentation performance.

Approach: Our SUA framework processes the negative prediction regions of a segmentation algorithm and detects FNs based on an aggregated uncertainty score. It can be combined with many existing uncertainty estimation methods to boost their performance. We compare the SUA framework with a baseline that processes each pixel's uncertainty independently.

Results: The results demonstrate that SUA detects FN regions effectively. It achieved an Fβ=0.5 of 0.92 on the in-domain test data and 0.85 on the domain-shift test data, compared with 0.81 and 0.48, respectively, for the baseline uncertainty. We also demonstrate that SUA yields improved overall segmentation performance compared with utilizing the baseline uncertainty.

Conclusions: We propose the SUA framework for incorporating and utilizing uncertainty estimates for FN detection in DL segmentation algorithms for histopathology. The evaluation confirms the benefits of our approach compared with assessing pixel uncertainty independently.
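The aggregation step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name, the use of 4-connectivity, mean aggregation over each region, and the flagging threshold are all assumptions for the sake of the example.

```python
import numpy as np
from collections import deque

def aggregate_uncertainty(pred_mask, uncertainty, threshold=0.5):
    """Flag connected negative-prediction regions whose mean per-pixel
    uncertainty exceeds `threshold` as suspected false negatives.

    pred_mask:   bool array, True where the model predicts positive.
    uncertainty: float array of the same shape, per-pixel uncertainty.
    Returns a bool array marking suspected FN regions.
    """
    h, w = pred_mask.shape
    visited = np.zeros((h, w), dtype=bool)
    flagged = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            if pred_mask[i, j] or visited[i, j]:
                continue
            # Breadth-first search over one 4-connected negative region.
            region = []
            queue = deque([(i, j)])
            visited[i, j] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and not pred_mask[ny, nx]
                            and not visited[ny, nx]):
                        visited[ny, nx] = True
                        queue.append((ny, nx))
            # Aggregate uncertainty over the region, then threshold it.
            ys, xs = zip(*region)
            if uncertainty[ys, xs].mean() > threshold:
                flagged[ys, xs] = True
    return flagged
```

Assessing the aggregated score per region, rather than thresholding each pixel independently, is what distinguishes this approach from the per-pixel baseline described in the abstract.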
KEYWORDS: Cancer detection, Image segmentation, Education and training, Breast cancer, Pathology, Lymph nodes, Histograms, Deep learning, Data modeling, Visualization
Computational pathology, a developing area of primarily deep learning (DL) solutions aiming to aid pathologists in their daily tasks, has shown promising results in research settings. In recent years, uncertainty estimation has gained substantial recognition for its potential to bring value to DL algorithms in medical applications. However, it is not trivial to incorporate it into a DL system in a way that has a real positive impact. In this work we propose a framework that spatially aggregates epistemic uncertainty in order to detect false negatives produced by an algorithm segmenting breast cancer metastases. We show a strong correlation between the false negative segmentation areas and the aggregated uncertainty values. Furthermore, the results include examples of reduced false negatives, where the uncertainty approach led to the detection of tumour metastases that had otherwise been missed.