Gated Channel Attention Network for Cataract Classification on AS-OCT Image

Bibliographic Details
Published in Neural Information Processing, Vol. 13110, pp. 357-368
Main Authors Xiao, Zunjie; Zhang, Xiaoqing; Higashita, Risa; Hu, Yan; Yuan, Jin; Chen, Wan; Liu, Jiang
Format Book Chapter
Language English
Published Switzerland: Springer International Publishing AG, 2021
Series Lecture Notes in Computer Science
ISBN 9783030922375; 3030922375
ISSN 0302-9743; 1611-3349
DOI 10.1007/978-3-030-92238-2_30

More Information
Summary: Nuclear cataract (NC) is the leading cause of blindness and vision impairment globally, and accurate NC classification is important for clinical NC diagnosis. Anterior segment optical coherence tomography (AS-OCT) is a non-contact, high-resolution, objective imaging technique that is widely used in diagnosing ophthalmic diseases. Clinical studies have shown a significant correlation between the pixel density of the lens region on AS-OCT images and NC severity levels; however, automatic NC classification on AS-OCT images has rarely been studied. Motivated by this clinical research, this paper proposes a gated channel attention network (GCA-Net) to classify NC severity levels automatically. In the GCA-Net, we design a gated channel attention block that fuses clinical prior knowledge: a gated layer filters out redundant features, and a Softmax layer models weak interactions among channels. We use a clinical AS-OCT image dataset to demonstrate the effectiveness of the GCA-Net. The results show that the proposed GCA-Net achieves 94.3% accuracy and outperforms strong baselines and state-of-the-art attention-based networks.
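
The record describes the gated channel attention block only at a high level (a gated layer that filters features plus a Softmax layer that produces weakly interacting channel weights). As a rough, hypothetical sketch of that idea, not the authors' implementation, a squeeze-and-excitation-style block with a learnable per-channel gate and a Softmax over channels could look as follows; the class name GatedChannelAttention, the reduction ratio, and the gate parameterization are assumptions made for illustration.

# Illustrative sketch only -- not the paper's exact GCA block.
# Assumes an SE-style design: global pooling gives per-channel statistics,
# a learnable gate scales the channel responses, and a Softmax over channels
# yields attention weights that interact only through normalization.
import torch
import torch.nn as nn

class GatedChannelAttention(nn.Module):  # hypothetical name
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)            # global average pooling
        self.fc = nn.Sequential(                          # channel descriptor MLP
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.gate = nn.Parameter(torch.ones(channels))    # learnable per-channel gate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = self.squeeze(x).view(b, c)                    # (B, C) channel statistics
        z = self.fc(s) * self.gate                        # gated channel responses
        w = torch.softmax(z, dim=1).view(b, c, 1, 1)      # channel attention weights
        return x * w                                      # reweight the feature maps

# Example: attach the block to a 64-channel feature map.
block = GatedChannelAttention(channels=64)
feats = torch.randn(2, 64, 28, 28)
print(block(feats).shape)  # torch.Size([2, 64, 28, 28])

In this sketch the Softmax couples the channel weights only through normalization, which is one plausible reading of the "weak interaction" between channels described in the abstract.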
Bibliography: Z. Xiao and X. Zhang contributed equally.