Deep Learning-based Quantification of Abdominal Subcutaneous and Visceral Fat Volume on CT Images
Published in: Academic Radiology
Format: Journal Article
Language: English
Published: United States, 01.11.2021
Summary: To develop a deep learning-based algorithm using the U-Net architecture to measure abdominal fat on computed tomography (CT) images.
Sequential CT images spanning the abdominal region of seven subjects were manually segmented to calculate subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT). The resulting segmentation maps of SAT and VAT were augmented using a template-based data augmentation approach to create a large dataset for neural network training. Neural network performance was evaluated on both sequential CT slices from three subjects and on randomly selected CT images from the upper, central, and lower abdominal regions of 100 subjects.
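The abstract names the U-Net architecture but does not publish the network code. As a rough illustration only, the sketch below shows a small U-Net-style encoder-decoder with skip connections in PyTorch that labels each pixel of a CT slice; the class name, layer widths, and the three-class output (background, SAT, VAT) are assumptions for illustration, not the authors' implementation.

```python
# Minimal U-Net-style sketch (assumption: 2D slices, 3 output classes), PyTorch.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with batch norm and ReLU, as in common U-Net variants.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    def __init__(self, in_channels=1, num_classes=3):
        super().__init__()
        self.enc1 = conv_block(in_channels, 32)
        self.enc2 = conv_block(32, 64)
        self.enc3 = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)              # full resolution
        e2 = self.enc2(self.pool(e1))  # 1/2 resolution
        e3 = self.enc3(self.pool(e2))  # 1/4 resolution (bottleneck)
        d2 = self.dec2(torch.cat([self.up2(e3), e2], dim=1))  # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)           # per-pixel class logits

# Example: one 512x512 single-channel CT slice.
if __name__ == "__main__":
    model = SmallUNet()
    logits = model(torch.randn(1, 1, 512, 512))
    print(logits.shape)  # torch.Size([1, 3, 512, 512])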
Both subcutaneous and abdominal cavity segmentation images created by the two methods were highly comparable, with an overall Dice similarity coefficient of 0.94. Pearson's correlation coefficients between the subcutaneous and visceral fat volumes quantified using the two methods were 0.99 and 0.99, and the overall percent residual squared errors were 5.5% and 8.5%, respectively. Manual segmentation of SAT and VAT on the 555 CT slices used for testing took approximately 46 hours, while automated segmentation took approximately 1 minute.
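For reference, the sketch below shows how the reported agreement metrics are commonly computed. The helper names are invented for illustration, and the exact form of the "percent residual squared error" is an assumption, since the abstract does not define that formula.

```python
# Sketch of the evaluation metrics named in the abstract (NumPy); names and the
# percent-error formula are assumptions, not the paper's stated definitions.
import numpy as np

def dice_coefficient(pred, ref):
    # Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|).
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    denom = pred.sum() + ref.sum()
    return 2.0 * np.logical_and(pred, ref).sum() / denom if denom else 1.0

def pearson_r(auto_volumes, manual_volumes):
    # Pearson correlation between automated and manual fat volumes across subjects.
    return np.corrcoef(auto_volumes, manual_volumes)[0, 1]

def percent_residual_error(auto_volumes, manual_volumes):
    # One plausible reading: root-mean-square residual as a percentage of the
    # mean manual volume (assumption only).
    auto = np.asarray(auto_volumes, dtype=float)
    manual = np.asarray(manual_volumes, dtype=float)
    rmse = np.sqrt(np.mean((auto - manual) ** 2))
    return 100.0 * rmse / manual.mean()

# Example with made-up fat volumes (litres) for three subjects:
auto = [3.1, 4.8, 2.2]
manual = [3.0, 5.0, 2.3]
print(pearson_r(auto, manual), percent_residual_error(auto, manual))
```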
Our data demonstrate that deep learning methods utilizing a template-based data augmentation strategy can be employed to accurately and rapidly quantify total abdominal SAT and VAT with a small number of training images.
ISSN: 1878-4046
DOI: 10.1016/j.acra.2020.07.010