Diabetic Retinopathy Detection Using VGG-NIN a Deep Learning Architecture

Bibliographic Details
Published in: IEEE Access, Vol. 9, pp. 61408-61416
Main Authors: Khan, Zubair; Khan, Fiaz Gul; Khan, Ahmad; Rehman, Zia Ur; Shah, Sajid; Qummar, Sehrish; Ali, Farman; Pack, Sangheon
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
Summary: Diabetic retinopathy (DR) is a disease that damages retinal blood vessels and leads to blindness. Colored fundus images are usually used to diagnose this irreversible disease. Manual analysis of these images by clinicians is monotonous and error-prone, so various hand-engineered computer vision techniques have been applied to detect DR and its stages automatically. However, these methods are computationally expensive, struggle to extract highly nonlinear features, and therefore fail to classify the different stages of DR effectively. This paper focuses on classifying the stages of DR with the fewest possible learnable parameters to speed up training and model convergence. VGG16, a spatial pyramid pooling (SPP) layer, and network-in-network (NiN) blocks are stacked to form a highly nonlinear, scale-invariant deep model called VGG-NiN. By virtue of the SPP layer, the proposed VGG-NiN model can process a DR image at any scale. Moreover, the stacked NiN blocks add extra nonlinearity to the model, leading to better classification. Experimental results show that the proposed model outperforms state-of-the-art methods in terms of accuracy and computational resource utilization.
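
The record contains no implementation details beyond the summary above, but the described stacking can be sketched roughly as follows. This is a minimal, illustrative PyTorch sketch, not the authors' implementation: it assumes the NiN stage is a pair of 1x1 convolutions placed after the VGG16 convolutional base, that the SPP layer pools at levels (1, 2, 4), and that five DR severity classes are predicted. The channel widths, pooling levels, and class count are assumptions, not values taken from the paper.

# Illustrative VGG-NiN-style model (assumed layout, not the published architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16


class SpatialPyramidPooling(nn.Module):
    """Pools a feature map at several grid sizes and concatenates the results,
    yielding a fixed-length vector regardless of the input spatial size."""
    def __init__(self, levels=(1, 2, 4)):
        super().__init__()
        self.levels = levels

    def forward(self, x):
        pooled = [F.adaptive_max_pool2d(x, output_size=l).flatten(1) for l in self.levels]
        return torch.cat(pooled, dim=1)


class VGGNiN(nn.Module):
    def __init__(self, num_classes=5, nin_channels=256, levels=(1, 2, 4)):
        super().__init__()
        # VGG16 convolutional base (use pretrained weights in practice).
        self.backbone = vgg16(weights=None).features
        # Assumed NiN block: stacked 1x1 convolutions for extra nonlinearity.
        self.nin = nn.Sequential(
            nn.Conv2d(512, nin_channels, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(nin_channels, nin_channels, kernel_size=1), nn.ReLU(inplace=True),
        )
        self.spp = SpatialPyramidPooling(levels)
        spp_dim = nin_channels * sum(l * l for l in levels)  # fixed-length SPP output
        self.classifier = nn.Linear(spp_dim, num_classes)

    def forward(self, x):
        x = self.backbone(x)   # VGG16 feature maps (size depends on input scale)
        x = self.nin(x)        # 1x1 convolutions add nonlinearity
        x = self.spp(x)        # fixed-length vector for any input scale
        return self.classifier(x)


if __name__ == "__main__":
    model = VGGNiN()
    for size in (224, 320):
        logits = model(torch.randn(1, 3, size, size))
        print(size, logits.shape)  # both inputs yield logits of shape (1, 5)

Because the SPP stage always returns a vector of the same length, fundus images of different resolutions can share one classifier head, which is one way to realize the scale invariance the summary attributes to the SPP layer.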
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3074422