Unpaired Image Denoising Using a Generative Adversarial Network in X-Ray CT
Published in: IEEE Access, Vol. 7, pp. 110414-110425
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2019
Summary: This paper proposes a deep learning-based denoising method for noisy low-dose computerized tomography (CT) images in the absence of paired training data. The proposed method uses a fidelity-embedded generative adversarial network (GAN) to learn a denoising function from unpaired training data of low-dose CT (LDCT) and standard-dose CT (SDCT) images, where the denoising function is the optimal generator in the GAN framework. This paper analyzes the f-GAN objective to derive a suitable generator that is optimized by minimizing a weighted sum of two losses: the Kullback-Leibler (KL) divergence between the SDCT data distribution and the generated distribution, and the $\ell_2$ loss between the LDCT image and the corresponding generated (denoised) image. The computed generator reflects, through training, the prior belief about the SDCT data distribution. We observed that the proposed method preserves fine anomalous features while eliminating noise. The experimental results show that the proposed deep-learning method trained on unpaired datasets performs comparably to a method trained on paired datasets. A clinical experiment was also performed to show the validity of the proposed method for noise arising in low-dose X-ray CT.
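
The weighted generator objective described in the summary can be made concrete with a short sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the authors' implementation: the networks `G` and `T`, the fidelity weight `lam`, and all tensor shapes are hypothetical, and the KL term uses the standard f-GAN surrogate with conjugate f*(t) = exp(t - 1) for KL(P_SDCT || P_G).

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins (not the paper's architecture): G denoises an
# LDCT image; T is the f-GAN variational function ("discriminator").
G = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 1, 3, padding=1))
T = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 1, 3, padding=1), nn.Flatten(),
                  nn.LazyLinear(1))

def f_star(t):
    # Convex conjugate of f(u) = u*log(u), i.e. the KL case: f*(t) = exp(t - 1).
    return torch.exp(t - 1.0)

def generator_loss(ldct, lam=0.1):
    """Weighted sum from the abstract (a sketch): KL surrogate + lam * l2 fidelity."""
    denoised = G(ldct)
    adv = -f_star(T(denoised)).mean()      # pulls G(ldct) toward the SDCT distribution
    fid = ((denoised - ldct) ** 2).mean()  # keeps the output close to the LDCT input
    return adv + lam * fid

def discriminator_loss(sdct, ldct):
    """Negated variational lower bound on KL(P_SDCT || P_G), for gradient descent."""
    denoised = G(ldct).detach()
    return -(T(sdct).mean() - f_star(T(denoised)).mean())

# Unpaired usage: the SDCT and LDCT batches need not come from the same scans.
sdct = torch.randn(4, 1, 64, 64)
ldct = torch.randn(4, 1, 64, 64)
print(generator_loss(ldct).item(), discriminator_loss(sdct, ldct).item())
```

The fidelity term is what distinguishes this objective from a plain GAN: with lam = 0 the generator is free to produce any SDCT-like image, while a larger lam anchors the output to the measured LDCT input, trading noise suppression against data fidelity.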
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2934178