High-Frequency Sensitive Generative Adversarial Network for Low-Dose CT Image Denoising
| Published in | IEEE Access, Vol. 8, pp. 930-943 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020 |
Summary: Low-dose computed tomography (LDCT) imaging has attracted tremendous attention because it reduces the potential cancer risk to patients by decreasing the radiation dose. However, reducing the radiation dose may degrade image quality by introducing noise and artifacts. The details of pathological information mainly reside in the high-frequency domain of the LDCT image, so some useful details may be lost or destroyed while the noise and artifacts are removed. To address this problem, we propose a high-frequency sensitive generative adversarial network (HFSGAN). The generator comprises two sub-networks: a high-frequency domain U-Net, specially designed to process the high-frequency components decomposed from the LDCT image, and an image-space U-Net, which processes information from the whole LDCT image. In addition, the discriminator in HFSGAN adopts an inception module to increase the receptive field and width of the network and to extract multi-scale features from the real and fake images. Experiments show that the proposed network preserves more texture detail in the denoised image while removing noise and artifacts. Compared with state-of-the-art networks, the proposed denoising method achieves better performance both quantitatively and visually.
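The abstract does not say how the high-frequency components are decomposed or how the two branches are fused. The sketch below assumes a simple Gaussian high-pass decomposition (image minus its blurred copy) and additive fusion; `TinyUNet` is a shallow stand-in for each U-Net branch, and all names and hyperparameters here are illustrative assumptions, not the authors' published code.

```python
# Sketch of the two-branch HFSGAN generator idea, assuming a Gaussian
# high-pass decomposition (the abstract does not specify the method).
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF

def high_frequency(x: torch.Tensor, kernel_size: int = 9, sigma: float = 3.0):
    """Split off the high-frequency residual by subtracting a Gaussian blur."""
    low = TF.gaussian_blur(x, kernel_size=[kernel_size, kernel_size],
                           sigma=[sigma, sigma])
    return x - low  # residual keeps edges and fine texture

class TinyUNet(nn.Module):
    """Shallow stand-in for a U-Net branch; a real U-Net would be deeper
    and would carry skip connections between encoder and decoder."""
    def __init__(self, ch: int = 1, width: int = 32):
        super().__init__()
        self.down = nn.Sequential(nn.Conv2d(ch, width, 3, 2, 1), nn.ReLU())
        self.up = nn.ConvTranspose2d(width, ch, 4, 2, 1)
    def forward(self, x):
        return self.up(self.down(x))

class DualBranchGenerator(nn.Module):
    """High-frequency U-Net plus image-space U-Net; additive fusion of the
    two branch outputs is an assumption for illustration."""
    def __init__(self):
        super().__init__()
        self.hf_unet = TinyUNet()
        self.img_unet = TinyUNet()
    def forward(self, ldct):
        hf = high_frequency(ldct)
        return self.img_unet(ldct) + self.hf_unet(hf)

g = DualBranchGenerator()
denoised = g(torch.randn(1, 1, 64, 64))  # toy single-channel LDCT batch
print(denoised.shape)                    # torch.Size([1, 1, 64, 64])
```

The point of the split is that the high-frequency branch sees only the residual where edges and pathological texture live, so the network can treat them separately from the smooth image content.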
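The discriminator's inception module runs parallel convolutions with different kernel sizes and concatenates the results, which widens the network and mixes receptive fields. A minimal sketch follows, assuming the classic 1x1 / 3x3 / 5x5 branch layout; the branch widths, depth, and pooling are illustrative assumptions, since the abstract does not give the exact configuration.

```python
# Minimal inception-style discriminator block, assuming the classic
# parallel 1x1 / 3x3 / 5x5 branch layout (widths are illustrative).
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_ch: int, branch_ch: int = 16):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, branch_ch, kernel_size=1)
        self.b3 = nn.Conv2d(in_ch, branch_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(in_ch, branch_ch, kernel_size=5, padding=2)
    def forward(self, x):
        # Concatenating branches extracts multi-scale features at once.
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)

class Discriminator(nn.Module):
    """Real/fake classifier over CT images; depth and pooling are assumed."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            InceptionBlock(1), nn.LeakyReLU(0.2),
            InceptionBlock(48), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(48, 1)  # 3 branches * 16 channels each
    def forward(self, x):
        f = self.features(x).flatten(1)
        return self.head(f)  # raw logit; pair with BCEWithLogitsLoss

d = Discriminator()
print(d(torch.randn(2, 1, 64, 64)).shape)  # torch.Size([2, 1])
```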
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2961983