Consistent Multi-Atlas Hippocampus Segmentation for Longitudinal MR Brain Images with Temporal Sparse Representation


Bibliographic Details
Published in: Patch-Based Techniques in Medical Imaging, Vol. 9993, pp. 34-42
Main Authors: Wang, Lin; Guo, Yanrong; Cao, Xiaohuan; Wu, Guorong; Shen, Dinggang
Format: Book Chapter; Journal Article
Language: English
Published: Springer International Publishing AG, Switzerland, 01.01.2016
Series: Lecture Notes in Computer Science
ISBN: 9783319471174; 3319471171
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-47118-1_5

Summary: In this paper, we propose a novel multi-atlas-based longitudinal label fusion method that uses a temporal sparse representation technique to segment the hippocampi at all time points simultaneously. First, we use groupwise longitudinal registration to simultaneously (1) estimate a group-mean image of a subject image sequence and (2) register all of its time-point images to the estimated group-mean image consistently over time. Then, by registering all atlases to the group-mean image, we can align the atlases to each time point of the subject image sequence in a longitudinally consistent manner. Finally, we propose a longitudinal label fusion method that propagates all atlas labels to the subject image sequence by simultaneously labeling sets of temporally corresponding voxels under a temporal consistency constraint on the sparse representation. Experimental results demonstrate that our proposed method achieves more accurate and consistent hippocampus segmentation than state-of-the-art counterpart methods.
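The temporal consistency constraint described in the summary can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes each target voxel is represented by one patch vector per time point, sparse-codes all time points jointly over an atlas patch dictionary via ISTA for a LASSO objective augmented with a quadratic smoothness penalty on consecutive codes, and fuses labels by weighted voting on coefficient mass. All function names and parameter values here are hypothetical.

```python
import numpy as np


def soft_threshold(v, thr):
    """Proximal operator of the L1 norm (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - thr, 0.0)


def temporal_sparse_coding(X, D, lam=0.01, gamma=0.1, n_iter=300):
    """Jointly sparse-code the target patches X[:, t] (one column per
    time point) over the atlas patch dictionary D, minimizing

        sum_t ||x_t - D w_t||^2 + lam * ||w_t||_1
        + gamma * sum_t ||w_t - w_{t-1}||^2

    via ISTA. The gamma term is the temporal consistency constraint:
    it discourages the sparse codes from changing between time points.
    """
    _, T = X.shape
    k = D.shape[1]
    W = np.zeros((k, T))
    # Step size from a Lipschitz bound on the smooth part of the objective.
    L = 2.0 * (np.linalg.norm(D, 2) ** 2 + 2.0 * gamma)
    step = 1.0 / L
    for _ in range(n_iter):
        grad = 2.0 * D.T @ (D @ W - X)          # data-fidelity gradient
        diff = np.zeros_like(W)                  # temporal-coupling gradient
        diff[:, 1:] += W[:, 1:] - W[:, :-1]
        diff[:, :-1] += W[:, :-1] - W[:, 1:]
        grad += 2.0 * gamma * diff
        W = soft_threshold(W - step * grad, step * lam)
    return W


def fuse_labels(W, atlas_labels):
    """Weighted vote per time point: compare the coefficient mass placed
    on foreground-labeled vs background-labeled atlas patches."""
    w = np.abs(W)
    fg = w[atlas_labels == 1].sum(axis=0)
    bg = w[atlas_labels == 0].sum(axis=0)
    return (fg > bg).astype(int)
```

Because every time point is coded in one joint problem, the temporal penalty pulls the per-time-point codes toward each other, which is what yields temporally consistent labels instead of independent per-frame decisions.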
Bibliography:This work was supported in part by National Natural Science Foundation of China (No. 61503300) and China Postdoctoral Science Foundation (No. 2014M560801).