Label-Efficient 3D Brain Segmentation via Complementary 2D Diffusion Models with Orthogonal Views

Bibliographic Details
Main Authors: Cho, Jihoon, Ahn, Suhyun, Kim, Beomju, Bae, Hyungjoon, Liu, Xiaofeng, Xing, Fangxu, Lee, Kyungeun, El Fakhri, Georges, Wedeen, Van, Woo, Jonghye, Park, Jinah
Format: Journal Article
Language: English
Published: 17.07.2024
DOI: 10.48550/arxiv.2407.12329

Summary: Deep learning-based segmentation techniques have shown remarkable performance in brain segmentation, yet their success hinges on the availability of extensive labeled training data. Acquiring such vast datasets, however, poses a significant challenge in many clinical applications. To address this issue, in this work we propose a novel 3D brain segmentation approach using complementary 2D diffusion models. The core idea behind our approach is to first mine semantically informative 2D features from the 2D diffusion models by taking orthogonal views as input, and then to fuse them into a 3D contextual feature representation. We then use these aggregated features to train multi-layer perceptrons that classify the segmentation labels. Our goal is to achieve reliable segmentation quality without requiring complete labels for each individual subject. Our experiments on brain subcortical structure segmentation, trained on data from only one subject, demonstrate that our approach outperforms state-of-the-art self-supervised learning methods. Further experiments probing the minimum annotation requirement via sparse labeling yield promising results even with only nine labeled slices and a labeled background region.
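The pipeline the summary describes can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a single Conv2d stands in for the frozen 2D diffusion model from which features are mined, the channel width, volume size, and label count are all hypothetical, and the fusion is a plain per-voxel concatenation of features from the three orthogonal views followed by an MLP classifier.

```python
# Minimal sketch (assumed, not the paper's code): fuse per-voxel 2D features
# from three orthogonal views of a volume, then classify each voxel with an MLP.
import torch
import torch.nn as nn

C = 16          # per-view feature channels (assumed)
NUM_LABELS = 5  # subcortical structures + background (assumed)

# Placeholder encoder; the paper instead mines features from a pretrained
# 2D diffusion model applied slice-by-slice.
encoder = nn.Conv2d(1, C, kernel_size=3, padding=1)
mlp = nn.Sequential(nn.Linear(3 * C, 64), nn.ReLU(), nn.Linear(64, NUM_LABELS))

def view_features(vol, dim):
    """Slice the volume along `dim`, encode each 2D slice, restack to 3D."""
    slices = vol.unbind(dim)                                 # tuple of 2D slices
    feats = [encoder(s[None, None])[0] for s in slices]      # each: (C, H, W)
    return torch.stack(feats, dim=dim + 1)                   # (C, D, H, W)

vol = torch.randn(32, 32, 32)                                # toy MRI volume
# Concatenate axial, coronal, and sagittal features per voxel: (3C, D, H, W).
fused = torch.cat([view_features(vol, d) for d in range(3)], dim=0)
logits = mlp(fused.flatten(1).T)                             # (num_voxels, NUM_LABELS)
labels = logits.argmax(dim=1).reshape(vol.shape)             # per-voxel segmentation
```

Treating each voxel's fused feature vector as an independent MLP input is presumably what makes sparse annotation viable: only voxels lying on the handful of labeled slices (and in the labeled background region) need to contribute to the training loss.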