Deep Generative Model of Individual Variability in fMRI Images of Psychiatric Patients

Bibliographic Details
Published in: IEEE Transactions on Biomedical Engineering, Vol. 68, No. 2, pp. 592-605
Main Authors: Matsubara, Takashi; Kusano, Koki; Tashiro, Tetsuo; Ukai, Ken'ya; Uehara, Kuniaki
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01.02.2021
Summary: Neuroimaging techniques, such as resting-state functional magnetic resonance imaging (fMRI), have been investigated to find objective biomarkers of neurological and psychiatric disorders. Objective biomarkers potentially provide a refined diagnosis and quantitative measurements of the effects of treatment. However, fMRI images are sensitive to individual variability, such as functional topography and personal attributes. Suppressing this irrelevant individual variability is crucial for finding objective biomarkers across multiple subjects. Herein, we propose a structured generative model based on deep learning (i.e., a deep generative model) that considers such individual variability. The proposed model builds a joint distribution of (preprocessed) fMRI images, state (with or without a disorder), and individual variability. It can thereby discriminate individual variability from the subject's state. Experimental results demonstrate that the proposed model can diagnose unknown subjects with greater accuracy than conventional approaches. Moreover, the diagnosis is fairer with respect to gender and state, because the proposed model extracts subject attributes (age, gender, and scan site) in an unsupervised manner.
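
The model described in the summary resembles a structured (conditional) variational autoencoder in which one latent variable captures individual variability alongside the diagnostic state. Below is a minimal sketch of that idea in PyTorch; it is not the authors' implementation, and the class name `StructuredVAE`, the layer sizes, the functional-connectivity-vector input, and the mean-squared-error reconstruction term are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), not the paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredVAE(nn.Module):
    def __init__(self, x_dim=6670, y_dim=2, z_dim=64, h_dim=256):
        super().__init__()
        # q(z | x, y): infers individual variability from the fMRI features and the state
        self.enc = nn.Sequential(nn.Linear(x_dim + y_dim, h_dim), nn.ReLU())
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        # p(x | y, z): reconstructs the fMRI features from the state and the variability latent
        self.dec = nn.Sequential(nn.Linear(y_dim + z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))
        # q(y | x): diagnostic classifier applied to unknown subjects
        self.cls = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, y_dim))

    def forward(self, x, y_onehot):
        h = self.enc(torch.cat([x, y_onehot], dim=-1))
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        x_recon = self.dec(torch.cat([y_onehot, z], dim=-1))
        # Negative evidence lower bound: reconstruction error + KL(q(z|x,y) || N(0, I))
        recon = F.mse_loss(x_recon, x, reduction="none").sum(-1)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return (recon + kl).mean(), self.cls(x)
```

In a setup of this kind, the classifier head q(y | x) provides the diagnosis for new subjects, while the latent variable z is free to absorb attributes such as age, gender, and scan site without explicit labels, which is the sense in which the summary describes unsupervised extraction of subject attributes.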
ISSN: 0018-9294 (print); 1558-2531 (electronic)
DOI: 10.1109/TBME.2020.3008707