Prediction of Premature Retinopathy Fundus Images Using Dense Network Model for Intelligent Portable Screening Device
Published in: Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications, Vol. 15, No. 2, pp. 170-182
Main Authors: , , ,
Format: Journal Article
Language: English
Published: 29.06.2022
Summary: Retinopathy of Prematurity (ROP) is a serious retinal condition that affects preterm infants and, if left untreated, can result in irreversible blindness. Because ROP diagnosis suffers from variability and inconsistency among observers, the development of an automated system for ROP prediction becomes imperative. While various methods have been explored for automated ROP diagnosis, dedicated models with satisfactory performance have been lacking. This study addresses these gaps by constructing a multi-channel dense Convolutional Neural Network (MCD-CNN) tailored for ROP prediction and suitable for large-scale infant screening. The pipeline comprises CLAHE pre-processing, image labelling, image denoising, masking, and image generation for retinal vessel prediction in fundus images. The multi-channel CNN applies a feature selection method to extract and choose features from the pre-processed images. The findings show that the proposed model attains a noteworthy 97.5% accuracy, 98% sensitivity, and 98.5% specificity, outperforming both pre-trained models and deep learning classifiers. Overall, the study contributes to improving ROP diagnosis and fostering access to healthcare, particularly in remote areas.
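The pre-processing step the abstract leads with, CLAHE (Contrast-Limited Adaptive Histogram Equalization), is a standard way to boost local contrast in low-contrast fundus images. The sketch below illustrates only the contrast-limiting idea in a simplified, global form: real CLAHE additionally tiles the image and interpolates between per-tile mappings (e.g. OpenCV's `cv2.createCLAHE`). The function name, clip limit, and synthetic test patch are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def clahe_like_equalize(img, clip_limit=0.02, n_bins=256):
    """Simplified, global version of contrast-limited histogram
    equalization (the core idea behind CLAHE, minus the tiling).
    img: uint8 grayscale image; clip_limit: fraction of total pixels
    above which a histogram bin is clipped."""
    hist, _ = np.histogram(img, bins=n_bins, range=(0, n_bins))
    # Clip each bin at clip_limit * total pixels and redistribute
    # the clipped excess evenly (this is what limits noise boosting)
    ceil = max(1, int(clip_limit * img.size))
    excess = int(np.sum(np.maximum(hist - ceil, 0)))
    hist = np.minimum(hist, ceil) + excess // n_bins
    # Build the intensity mapping from the normalized cumulative histogram
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    lut = np.round(cdf * (n_bins - 1)).astype(np.uint8)
    return lut[img]

# Synthetic low-contrast patch: intensities squeezed into 100..140
rng = np.random.default_rng(0)
patch = rng.integers(100, 141, size=(64, 64), dtype=np.uint8)
out = clahe_like_equalize(patch)
```

After equalization the intensity range of `out` is much wider than the original 100–140 band, which is exactly the effect that helps a downstream CNN see retinal vessel structure.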
ISSN: 2093-5374, 2093-5382
DOI: 10.58346/JOWUA.2024.I2.012