Nailfold Microhemorrhage Segmentation with Modified U-Shape Convolutional Neural Network

Bibliographic Details
Published in: Applied Sciences, Vol. 12, No. 10, p. 5068
Main Authors: Liu, Ruiqi; Tian, Jing; Li, Yuemei; Chen, Na; Yan, Jianshe; Li, Taihao; Liu, Shupeng
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.05.2022

Summary: Nailfold capillaroscopy is a reliable way to detect and analyze microvascular abnormalities. It is safe, simple, noninvasive, and inexpensive. Among the capillaroscopic abnormalities, nailfold microhemorrhages are closely associated with early vascular damage and may be present in numerous diseases such as glaucoma, diabetes mellitus, and systemic sclerosis. Segmentation of nailfold microhemorrhages provides valuable pathological information that may lead to further investigations. In this study, a novel deep learning architecture named DAFM-Net is proposed for accurate segmentation. The network mainly consists of a U-shaped backbone, a dual attention fusion module, and group normalization layers. The U-shaped backbone generates rich hierarchical representations, while the dual attention fusion module uses the captured features for fine adjustment. Group normalization is introduced as a normalization method that improves the convergence of the deep neural network. The effectiveness of the proposed model is validated through ablation studies and segmentation experiments; DAFM-Net achieves competitive performance for nailfold microhemorrhage segmentation, with an IoU score of 78.03% and a Dice score of 87.34% against the ground truth.
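The IoU and Dice scores reported in the abstract compare a predicted binary mask with the ground-truth mask. A minimal sketch of how these two metrics are typically computed is shown below; the function names and the toy masks are illustrative, not taken from the paper.

```python
def iou_score(pred, target):
    """Intersection over Union for two flat binary masks (iterables of 0/1)."""
    inter = sum(p & t for p, t in zip(pred, target))  # pixels positive in both
    union = sum(p | t for p, t in zip(pred, target))  # pixels positive in either
    return inter / union if union else 1.0            # empty masks count as a perfect match

def dice_score(pred, target):
    """Dice coefficient: 2 * |A ∩ B| / (|A| + |B|)."""
    inter = sum(p & t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    return 2 * inter / total if total else 1.0

# Toy example: intersection = 2 pixels, union = 4 pixels.
pred   = [1, 1, 0, 1, 0, 0]
target = [1, 0, 0, 1, 1, 0]
print(iou_score(pred, target))   # 0.5
print(dice_score(pred, target))  # 2/3 ≈ 0.667
```

Note the fixed relationship Dice = 2·IoU / (1 + IoU), which is why a 78.03% IoU corresponds to a Dice score in the high eighties.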
ISSN: 2076-3417
DOI: 10.3390/app12105068