A Novel Sparse Group Gaussian Graphical Model for Functional Connectivity Estimation
| Published in | Information Processing in Medical Imaging, Vol. 23, pp. 256–267 |
|---|---|
| Main Authors | , , , |
| Format | Book Chapter; Journal Article |
| Language | English |
| Published | Berlin, Heidelberg: Springer Berlin Heidelberg, 2013 |
| Series | Lecture Notes in Computer Science |
| Subjects | |
| Summary | The estimation of intra-subject functional connectivity is greatly complicated by the small sample size and complex noise structure in functional magnetic resonance imaging (fMRI) data. Pooling samples across subjects improves the conditioning of the estimation, but loses subject-specific connectivity information. In this paper, we propose a new sparse group Gaussian graphical model (SGGGM) that facilitates joint estimation of intra-subject and group-level connectivity. This is achieved by casting functional connectivity estimation as a regularized consensus optimization problem, in which information across subjects is aggregated in learning group-level connectivity and group information is propagated back in estimating intra-subject connectivity. On synthetic data, we show that incorporating group information using SGGGM significantly enhances intra-subject connectivity estimation over existing techniques. More accurate group-level connectivity is also obtained. On real data from a cohort of 60 subjects, we show that integrating intra-subject connectivity estimated with SGGGM significantly improves brain activation detection over connectivity priors derived from other graphical modeling approaches. |
|---|---|
| ISBN | 3642388671; 9783642388675 |
| ISSN | 0302-9743; 1011-2499; 1611-3349 |
| DOI | 10.1007/978-3-642-38868-2_22 |
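The summary describes pooling information across subjects to learn group-level connectivity and propagating that group estimate back into per-subject estimation. The sketch below is not the paper's SGGGM or its consensus optimization; it is a simplified two-pass illustration of the same idea using off-the-shelf graphical lasso: fit each subject independently, average the precision matrices into a group estimate, then shrink each subject's empirical covariance toward the group before refitting. The penalty `alpha` and shrinkage weight `lam` are hypothetical tuning parameters chosen for illustration.

```python
# Hedged sketch: group-informed sparse precision estimation.
# NOT the paper's SGGGM; a simplified two-pass stand-in for the
# aggregate-then-propagate scheme described in the summary.
import numpy as np
from sklearn.covariance import empirical_covariance, graphical_lasso

rng = np.random.default_rng(0)
p, n_subjects, n_samples = 10, 5, 40  # ROIs, subjects, time points

# Shared sparse ground-truth precision (chain graph), common to the group.
theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov_true = np.linalg.inv(theta)
subjects = [rng.multivariate_normal(np.zeros(p), cov_true, n_samples)
            for _ in range(n_subjects)]

# Pass 1: independent sparse precision estimate per subject.
alpha = 0.1  # hypothetical sparsity penalty
indep = [graphical_lasso(empirical_covariance(x), alpha)[1] for x in subjects]

# Group-level connectivity: aggregate the subject-level precisions.
group_prec = np.mean(indep, axis=0)
group_cov = np.linalg.inv(group_prec)

# Pass 2: propagate group information back by shrinking each subject's
# empirical covariance toward the group covariance, then refitting.
lam = 0.5  # hypothetical shrinkage weight toward the group estimate
refined = []
for x in subjects:
    shrunk = (1 - lam) * empirical_covariance(x) + lam * group_cov
    refined.append(graphical_lasso(shrunk, alpha)[1])

print("group precision shape:", group_prec.shape)
print("refined subject estimates:", len(refined))
```

The consensus formulation in the paper couples these two passes into one optimization; the sketch only mimics a single aggregate/propagate round to make the data flow concrete.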