Weak label based Bayesian U-Net for optic disc segmentation in fundus images
Published in: Artificial intelligence in medicine, Vol. 126, p. 102261
Format: Journal Article
Language: English
Published: Netherlands: Elsevier B.V., 01.04.2022
Summary: Fundus images have been widely used in routine examinations of ophthalmic diseases. For some diseases, the pathological changes mainly occur around the optic disc area; therefore, detection and segmentation of the optic disc are critical pre-processing steps in fundus image analysis. Current machine learning based optic disc segmentation methods typically require manual segmentation of the optic disc for supervised training. However, annotating pixel-level optic disc masks is time-consuming and inevitably induces inter-subject variance. To address these limitations, we propose a weak label based Bayesian U-Net that exploits Hough transform based annotations to segment optic discs in fundus images. To achieve this, we build a probabilistic graphical model and explore a Bayesian approach with the state-of-the-art U-Net framework. To optimize the model, the expectation-maximization algorithm is used to alternately estimate the optic disc mask and update the weights of the Bayesian U-Net. Our evaluation demonstrates strong performance of the proposed method compared to both fully- and weakly-supervised baselines.

Highlights:
• We propose a Bayesian model for optic disc segmentation without manual annotation.
• The proposed Bayesian U-Net considers uncertainty introduced by noisy labels.
• We explore our Bayesian U-Net under the Expectation-Maximization framework.
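The EM alternation the abstract describes can be illustrated with a minimal numpy sketch: the E-step computes a posterior over the latent disc mask from the current prediction and a noisy Hough-style circle label, and the M-step fits the model to that soft target. This is not the paper's implementation; the single-feature logistic classifier (standing in for the U-Net), the symmetric label-noise rate `eps`, the toy image, and the learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 32
yy, xx = np.mgrid[:H, :W]

# Toy "fundus image": a bright disc on a dark background, plus noise.
true_mask = (yy - 16) ** 2 + (xx - 16) ** 2 <= 8 ** 2
image = 0.1 + 0.8 * true_mask + rng.normal(0.0, 0.05, (H, W))

# Weak label: a Hough-transform-style circle, deliberately misplaced.
weak = (yy - 14) ** 2 + (xx - 18) ** 2 <= 8 ** 2

def predict(img, w, b):
    """Pixel-wise foreground probability from a logistic stand-in model."""
    return 1.0 / (1.0 + np.exp(-(w * img + b)))

w, b = 1.0, 0.0
eps = 0.2  # assumed probability that the weak label flips a pixel

for _ in range(20):
    # E-step: posterior p(mask=1 | image, weak) via Bayes' rule, with the
    # model output as the prior and a symmetric noise model for the label.
    prior = predict(image, w, b)
    like1 = np.where(weak, 1.0 - eps, eps)  # p(weak | mask = 1)
    like0 = np.where(weak, eps, 1.0 - eps)  # p(weak | mask = 0)
    post = prior * like1 / (prior * like1 + (1.0 - prior) * like0)

    # M-step: one gradient step on cross-entropy against the soft posterior.
    grad = predict(image, w, b) - post
    w -= 0.5 * np.mean(grad * image)
    b -= 0.5 * np.mean(grad)
```

Thresholding `post` at 0.5 then yields a refined mask that sits closer to the bright disc than the misplaced weak circle did; in the paper the M-step instead retrains the Bayesian U-Net on the estimated mask.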
ISSN: 0933-3657, 1873-2860
DOI: 10.1016/j.artmed.2022.102261