Deep learning for fully automated tumor segmentation and extraction of magnetic resonance radiomics features in cervical cancer

Bibliographic Details
Published in: European Radiology, Vol. 30, No. 3, pp. 1297–1305
Main Authors: Lin, Yu-Chun; Lin, Chia-Hung; Lu, Hsin-Ying; Chiang, Hsin-Ju; Wang, Ho-Kai; Huang, Yu-Ting; Ng, Shu-Hang; Hong, Ji-Hong; Yen, Tzu-Chen; Lai, Chyong-Huey; Lin, Gigin
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.03.2020

Summary:
Objective: To develop and evaluate the performance of U-Net for fully automated localization and segmentation of cervical tumors in magnetic resonance (MR) images, and to assess the robustness of extracting apparent diffusion coefficient (ADC) radiomics features.
Methods: This retrospective study analyzed MR images captured from 169 patients with stage IB–IVA cervical cancer; diffusion-weighted (DW) images from 144 patients were used for training, and another 25 patients were recruited for testing. A U-Net convolutional network was developed to perform automated tumor segmentation, with the manually delineated tumor region serving as the ground truth. Segmentation performance was assessed for various combinations of input sources for training. ADC radiomics features were extracted and assessed using Pearson correlation, and the reproducibility of training was also assessed.
Results: Combining b0, b1000, and ADC images as a triple-channel input exhibited the highest learning efficacy in the training phase and the highest accuracy in the testing dataset, with a Dice coefficient of 0.82, a sensitivity of 0.89, and a positive predictive value of 0.92. The first-order ADC radiomics parameters were significantly correlated between the manually contoured and fully automated segmentation methods (p < 0.05). Reproducibility between the first and second training iterations was high for the first-order radiomics parameters (intraclass correlation coefficient = 0.70–0.99).
Conclusion: U-Net-based deep learning can perform accurate localization and segmentation of cervical cancer in DW MR images. First-order radiomics features extracted from the whole tumor volume are potentially robust enough for longitudinal monitoring of tumor responses in broad clinical settings.
Key Points:
• U-Net-based deep learning can perform accurate, fully automated localization and segmentation of cervical cancer in diffusion-weighted MR images.
• Combining b0, b1000, and apparent diffusion coefficient (ADC) images exhibited the highest accuracy in fully automated localization.
• First-order radiomics feature extraction from the whole tumor volume was robust and could thus potentially be used for longitudinal monitoring of treatment responses.
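
For illustration, the triple-channel input described in the Methods (b0, b1000, and ADC stacked as separate channels of one image) could be assembled along the following lines. This is a minimal sketch, assuming 2-D NumPy slices and a PyTorch U-Net whose first convolution takes three input channels; the normalization choice and function names are illustrative assumptions, not the authors' code.

```python
import numpy as np
import torch
import torch.nn as nn

def make_triple_channel(b0, b1000, adc):
    """Stack b0, b1000, and ADC slices into a 3-channel input array.

    Each argument is a 2-D NumPy array of identical shape; per-image
    z-score normalization is an assumed preprocessing step.
    """
    channels = []
    for img in (b0, b1000, adc):
        img = img.astype(np.float32)
        img = (img - img.mean()) / (img.std() + 1e-8)
        channels.append(img)
    return np.stack(channels, axis=0)          # shape: (3, H, W)

# A U-Net for this input would take 3 channels and emit a 1-channel tumor
# mask; for example, its first convolution could be defined as:
first_conv = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=3, padding=1)

x = torch.from_numpy(make_triple_channel(np.random.rand(256, 256),
                                         np.random.rand(256, 256),
                                         np.random.rand(256, 256))).unsqueeze(0)
print(first_conv(x).shape)                     # torch.Size([1, 64, 256, 256])
```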
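The segmentation metrics reported in the Results (Dice coefficient, sensitivity, and positive predictive value) can be computed from a pair of binary masks as sketched below; the function and variable names are assumptions rather than code from the study.

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Dice coefficient, sensitivity, and positive predictive value of a
    predicted binary mask against a ground-truth binary mask."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    dice = 2.0 * tp / (2.0 * tp + fp + fn)
    sensitivity = tp / (tp + fn)   # fraction of the manual contour recovered
    ppv = tp / (tp + fp)           # precision of the automated mask
    return dice, sensitivity, ppv

# Toy example with two overlapping 2-D masks
auto = np.zeros((4, 4), dtype=int);   auto[1:3, 1:3] = 1
manual = np.zeros((4, 4), dtype=int); manual[1:3, 1:4] = 1
print(segmentation_metrics(auto, manual))   # (0.8, 0.666..., 1.0)
```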
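First-order ADC radiomics extraction and the Pearson comparison between manually contoured and automated masks could look roughly like the sketch below, here using the open-source pyradiomics and SciPy packages (the abstract does not specify which toolkit the authors used); all file names and list variables are hypothetical.

```python
from radiomics import featureextractor   # pyradiomics
from scipy.stats import pearsonr

# Restrict extraction to first-order features, the class compared in the study.
extractor = featureextractor.RadiomicsFeatureExtractor()
extractor.disableAllFeatures()
extractor.enableFeatureClassByName('firstorder')

def first_order_features(adc_path, mask_path):
    """First-order features of the ADC map within a tumor mask (image file paths)."""
    result = extractor.execute(adc_path, mask_path)
    return {k: float(v) for k, v in result.items()
            if k.startswith('original_firstorder_')}

# Hypothetical per-patient file lists: ADC maps plus manual and U-Net masks.
# manual = [first_order_features(a, m) for a, m in zip(adc_paths, manual_masks)]
# auto   = [first_order_features(a, m) for a, m in zip(adc_paths, unet_masks)]
# r, p = pearsonr([f['original_firstorder_Mean'] for f in manual],
#                 [f['original_firstorder_Mean'] for f in auto])
```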
ISSN: 0938-7994, 1432-1084
DOI: 10.1007/s00330-019-06467-3