Full‐scale attention network for automated organ segmentation on head and neck CT and MR images
Published in | IET Image Processing, Vol. 17, no. 3, pp. 660–673
---|---
Main Authors | , , , , , , , ,
Format | Journal Article
Language | English
Published | Wiley, 01.02.2023
Summary | MRI and CT images have been routinely used in clinical practice for treatment planning of head‐and‐neck (HAN) radiotherapy. Delineating organs‐at‐risk (OARs) is an essential step in radiotherapy; however, it is time‐consuming and prone to inter‐observer variation. Existing automatic segmentation approaches are either limited by image registration or lack global spatial awareness, and thus underperform when segmenting complex anatomies. Herein, we propose a full‐scale attention network (FSANet) that integrates bi‐side skip connections, full‐scale feature fusion modules (FFMs), and a feature pyramid fusion and supervision module (FPFSM) to accurately and efficiently delineate OARs in the HAN region on CT and MRI scans. Specifically, bi‐side skip connections were adopted to keep small targets in the deep network and to capture semantic features at different scales. The FFMs, with cascaded attention mechanisms, were used to recalibrate significant channels and salient regions in the feature maps. The FPFSM was used to guide the network to learn hierarchical representations and thereby improve segmentation robustness. The proposed algorithm was validated on the public benchmark HAN CT dataset and an in‐house MR dataset. Results on both datasets show significant improvement over state‐of‐the‐art single‐stage OAR segmentation methods for the HAN region.
ISSN | 1751-9659; 1751-9667
DOI | 10.1049/ipr2.12663
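The abstract names the building blocks of FSANet (full‐scale fusion of multi‐scale features followed by cascaded channel and spatial attention) but not their exact design. The sketch below is an illustrative PyTorch interpretation under those assumptions, not the authors' implementation: the class names (ChannelAttention, SpatialAttention, FullScaleFusion), the SE‐style channel gating, and the 7×7 spatial gate are all hypothetical choices made only to show how such a fusion module could recalibrate channels and salient regions.

```python
# Illustrative sketch only: the abstract does not give the exact FFM design,
# so this combines squeeze-and-excitation-style channel attention with a
# spatial attention gate, cascaded, over features resampled to one scale.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Recalibrate channels via global average pooling + two FC layers (SE-style)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))          # (N, C) channel weights
        return x * w[:, :, None, None]           # rescale each channel


class SpatialAttention(nn.Module):
    """Highlight salient regions with a single-channel attention map."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size=7, padding=3)

    def forward(self, x):
        return x * torch.sigmoid(self.conv(x))


class FullScaleFusion(nn.Module):
    """Hypothetical full-scale fusion: project features from all scales,
    resize them to a target size, concatenate, then apply cascaded attention."""
    def __init__(self, in_channels_list, out_channels: int):
        super().__init__()
        self.proj = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels_list
        )
        fused = out_channels * len(in_channels_list)
        self.fuse = nn.Conv2d(fused, out_channels, kernel_size=3, padding=1)
        self.channel_att = ChannelAttention(out_channels)
        self.spatial_att = SpatialAttention(out_channels)

    def forward(self, features, target_size):
        resized = [
            F.interpolate(p(f), size=target_size, mode="bilinear", align_corners=False)
            for p, f in zip(self.proj, features)
        ]
        x = self.fuse(torch.cat(resized, dim=1))
        return self.spatial_att(self.channel_att(x))   # cascaded attention


if __name__ == "__main__":
    # Toy check with three feature maps at different scales.
    feats = [torch.randn(1, c, s, s) for c, s in [(64, 64), (128, 32), (256, 16)]]
    ffm = FullScaleFusion([64, 128, 256], out_channels=64)
    print(ffm(feats, target_size=(64, 64)).shape)  # torch.Size([1, 64, 64, 64])
```

Deep supervision on the fused pyramid outputs (as the FPFSM described in the abstract) would add auxiliary segmentation heads at each scale; that part is omitted here since the abstract gives no detail about its loss weighting.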