Attention Guided U-Net With Atrous Convolution for Accurate Retinal Vessels Segmentation

Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 32826–32839
Main Authors: Lv, Yan; Ma, Hui; Li, Jianian; Liu, Shuangcai
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020

Summary: The accuracy of retinal vessels segmentation is of great significance for the diagnosis of cardiovascular diseases such as diabetes and hypertension. In particular, the segmentation accuracy at the ends of vessels is affected by the area outside the retina in fundus images. In this paper, we propose an attention guided U-Net with atrous convolution (AA-UNet), which guides the model to separate vessel and non-vessel pixels and reuses deep features. Firstly, AA-UNet regresses a boundary box for the retinal region to generate an attention mask, which is used as a weighting function to multiply the differential feature maps in the model, so that the model pays more attention to the vessel region. Secondly, atrous convolution replaces ordinary convolution in the feature layers, which increases the receptive field and reduces the amount of computation. Then, we add two shortcuts to the atrous convolution in order to reuse features, so that the details of vessels are more prominent. On the DRIVE, STARE and CHASE_DB1 datasets, our model achieves accuracies of 0.9558/0.9640/0.9608 and AUCs of 0.9847/0.9824/0.9865, respectively. The results show that our method improves the accuracy of retinal vessels segmentation and exceeds other representative retinal vessels segmentation methods.
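
For readers who want a concrete picture of the two mechanisms the abstract describes, the sketch below shows (a) an atrous-convolution block with two shortcut connections for feature reuse and (b) an attention mask applied as an element-wise weighting of a feature map. It is a minimal PyTorch illustration under assumed channel counts and dilation rates, not the authors' implementation; the class and function names are hypothetical.

# Minimal sketch (not the authors' code) of the two ideas in the abstract:
# an atrous block with two shortcuts, and attention-mask re-weighting.
# Channel counts and dilation rates are illustrative assumptions.
import torch
import torch.nn as nn

class AtrousBlockWithShortcuts(nn.Module):
    """Dilated (atrous) convolutions with two additive shortcuts for feature reuse."""
    def __init__(self, channels: int, dilation: int = 2):
        super().__init__()
        # padding=dilation keeps the spatial size unchanged for 3x3 kernels
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=dilation, dilation=dilation)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=dilation, dilation=dilation)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out1 = self.relu(self.conv1(x)) + x      # first shortcut: reuse the block input
        out2 = self.relu(self.conv2(out1)) + out1  # second shortcut: reuse intermediate features
        return out2

def apply_attention_mask(features: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Weight a (N, C, H, W) feature map by a (N, 1, H, W) retinal-region mask in [0, 1]."""
    return features * mask                       # element-wise weighting toward the vessel region

if __name__ == "__main__":
    feats = torch.randn(1, 32, 64, 64)           # dummy feature map
    mask = torch.rand(1, 1, 64, 64)              # dummy attention mask
    block = AtrousBlockWithShortcuts(channels=32)
    out = block(apply_attention_mask(feats, mask))
    print(out.shape)                             # torch.Size([1, 32, 64, 64])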
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2974027