A Cascade U‐Net With Transformer for Retinal Multi‐Lesion Segmentation

Bibliographic Details
Published in: International Journal of Imaging Systems and Technology, Vol. 34, No. 5
Main Authors: Zheng, Haiyang; Liu, Feng
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc. / Wiley Subscription Services, Inc., 01.09.2024
Summary: Diabetic retinopathy (DR) is a major cause of blindness; if not diagnosed and treated in a timely manner, it can lead to irreversible vision loss. The diagnosis of DR relies heavily on specialized ophthalmologists. In recent years, with the development of artificial intelligence, a number of diagnostic methods based on this technology have begun to appear. One approach in this field is to segment four common kinds of lesions from color fundus images: exudates (EX), soft exudates (SE), hemorrhages (HE), and microaneurysms (MA). In this paper, we propose a deep-learning segmentation model for DR. The main part of the model consists of two cascaded transformer-based improved U-Net networks, corresponding to the coarse-segmentation and fine-segmentation stages, respectively. The model segments all four kinds of lesions from an input color fundus image simultaneously. To validate its performance, we test the model on three public datasets: IDRiD, DDR, and DIARETDB1. The results show that the proposed model achieves competitive results compared with existing methods in terms of PR-AUC, ROC-AUC, Dice, and IoU, especially for the segmentation of SE and MA lesions.
ISSN: 0899-9457
eISSN: 1098-1098
DOI: 10.1002/ima.23163