Elongated Physiological Structure Segmentation via Spatial and Scale Uncertainty-aware Network

Bibliographic Details
Published in: arXiv.org
Main Authors: Zhang, Yinglin; Xi, Ruiling; Fu, Huazhu; Towey, Dave; Bai, RuiBin; Higashita, Risa; Liu, Jiang
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 30.05.2023

Summary: Robust and accurate segmentation of elongated physiological structures is challenging, especially in ambiguous regions, such as corneal endothelium microscopy images with uneven illumination or fundus images with disease interference. In this paper, we present a spatial and scale uncertainty-aware network (SSU-Net) that fully exploits both spatial and scale uncertainty to highlight ambiguous regions and integrate hierarchical structure contexts. First, we estimate epistemic and aleatoric spatial uncertainty maps using Monte Carlo dropout to approximate Bayesian networks. Based on these spatial uncertainty maps, we propose the gated soft uncertainty-aware (GSUA) module to guide the model to focus on ambiguous regions. Second, we extract the uncertainty under different scales and propose the multi-scale uncertainty-aware (MSUA) fusion module to integrate structure contexts from hierarchical predictions, strengthening the final prediction. Finally, we visualize the uncertainty map of the final prediction, providing interpretability for the segmentation results. Experimental results show that SSU-Net performs best on corneal endothelial cell and retinal vessel segmentation tasks. Moreover, compared with counterpart uncertainty-based methods, SSU-Net is more accurate and robust.
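To make the spatial-uncertainty step concrete, the sketch below shows one common way to obtain epistemic and aleatoric uncertainty maps from a segmentation network with Monte Carlo dropout, as described in the summary. It is not the authors' released code: the network `model`, the number of stochastic passes `T`, and the particular decomposition (predictive variance as epistemic, expected per-pass entropy as a proxy for aleatoric) are assumptions for illustration.

```python
# Minimal sketch (assumed PyTorch setup, not the authors' implementation):
# spatial uncertainty maps via Monte Carlo dropout.
import torch
import torch.nn.functional as F

@torch.no_grad()
def mc_dropout_uncertainty(model, image, T=10):
    """Run T stochastic forward passes with dropout kept active and derive
    per-pixel epistemic and aleatoric uncertainty maps."""
    model.eval()
    # Keep dropout layers sampling different masks at inference time.
    for m in model.modules():
        if isinstance(m, torch.nn.Dropout):
            m.train()

    probs = []
    for _ in range(T):
        logits = model(image)                  # (B, C, H, W)
        probs.append(F.softmax(logits, dim=1))
    probs = torch.stack(probs, dim=0)          # (T, B, C, H, W)

    mean_p = probs.mean(dim=0)                 # averaged prediction, (B, C, H, W)
    # Epistemic proxy: variance of class probabilities across passes.
    epistemic = probs.var(dim=0).sum(dim=1)    # (B, H, W)
    # Aleatoric proxy: expected entropy of each individual pass.
    aleatoric = (-(probs * probs.clamp_min(1e-8).log()).sum(dim=2)).mean(dim=0)  # (B, H, W)
    return mean_p, epistemic, aleatoric
```

Maps like `epistemic` and `aleatoric` are what a gating module such as the paper's GSUA could consume to re-weight features in ambiguous regions; the exact gating and multi-scale fusion (MSUA) are described only at a high level in this abstract.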
ISSN:2331-8422