A joint autoencoder and classifier deep neural network for AD and MCI classification
Published in: International Journal of Imaging Systems and Technology, Vol. 34, No. 2
Format: Journal Article
Language: English
Published: John Wiley & Sons, Inc / Wiley Subscription Services, Inc, Hoboken, USA, 01.03.2024
Summary: In this article, we present a new approach to distinguish progressive mild cognitive impairment (pMCI) subjects, who eventually develop Alzheimer's disease (AD), from stable MCI (sMCI) subjects, whose condition does not deteriorate into AD. The proposed approach combines the discriminating capabilities of classifiers with the representation learning capacities of autoencoders in a unified architecture, and is hence termed the joint autoencoder and classifier deep neural network (JACDNN). JACDNN employs a single classifier and multiple autoencoders that are trained together to perform pattern classification. The classifier in JACDNN is trained using standard approaches to distinguish between subjects from different classes using the binary cross-entropy loss. The autoencoders in JACDNN regularize individual layers of the classification network so that they learn representations useful for reconstructing a given input. The performance of JACDNN has been evaluated on several machine learning problems pertaining to dementia, namely AD versus cognitively normal (CN) subjects, AD versus sMCI, CN versus pMCI, and pMCI versus sMCI. These problems are targeted using two datasets: the first consists of gray matter (GM) features of subjects, and the second consists of a combination of GM and white matter (WM) features. Better classification results are obtained when the classifier is built on combined GM and WM features than on GM features alone. Performance comparisons of JACDNN with existing approaches on these problems indicate that JACDNN performs better.
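As a rough illustration of the idea summarized above, the sketch below shows a small PyTorch network in which each hidden layer of a binary classifier is paired with a decoder that reconstructs that layer's input, and the binary cross-entropy loss is combined with the per-layer reconstruction losses during joint training. This is a minimal reading of the abstract, not the authors' JACDNN: the layer sizes, the choice of one decoder per hidden layer, and the weighting factor `alpha` are illustrative assumptions.

```python
# Minimal sketch of a joint autoencoder-and-classifier network in the spirit of
# JACDNN, assuming a PyTorch implementation. Dimensions, the per-layer decoders,
# and `alpha` are assumptions for illustration, not the paper's configuration.
import torch
import torch.nn as nn

class JointAEClassifier(nn.Module):
    def __init__(self, in_dim=512, hidden_dims=(256, 128), alpha=0.1):
        super().__init__()
        self.alpha = alpha  # weight of the autoencoder (reconstruction) terms
        dims = (in_dim, *hidden_dims)
        # Encoder layers shared with the classifier.
        self.encoders = nn.ModuleList(
            [nn.Sequential(nn.Linear(dims[i], dims[i + 1]), nn.ReLU())
             for i in range(len(hidden_dims))]
        )
        # One decoder per hidden layer: reconstructs that layer's input.
        self.decoders = nn.ModuleList(
            [nn.Linear(dims[i + 1], dims[i]) for i in range(len(hidden_dims))]
        )
        self.classifier = nn.Linear(dims[-1], 1)  # binary output, e.g. pMCI vs. sMCI
        self.bce = nn.BCEWithLogitsLoss()
        self.mse = nn.MSELoss()

    def forward(self, x, y=None):
        recon_loss = 0.0
        h = x
        for enc, dec in zip(self.encoders, self.decoders):
            z = enc(h)
            # Reconstruction of the layer input regularizes the representation z.
            recon_loss = recon_loss + self.mse(dec(z), h)
            h = z
        logits = self.classifier(h).squeeze(-1)
        if y is None:
            return logits
        # Joint objective: classification loss + weighted reconstruction losses.
        loss = self.bce(logits, y.float()) + self.alpha * recon_loss
        return logits, loss

# Toy usage on random features standing in for GM/WM feature vectors.
model = JointAEClassifier(in_dim=512)
x = torch.randn(8, 512)
y = torch.randint(0, 2, (8,))
logits, loss = model(x, y)
loss.backward()
```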
ISSN: 0899-9457, 1098-1098
DOI: 10.1002/ima.23054