Retrospective Motion Correction in Multishot MRI using Generative Adversarial Network

Bibliographic Details
Published in: Scientific Reports, Vol. 10, no. 1, p. 4786
Main Authors: Usman, Muhammad; Latif, Siddique; Asim, Muhammad; Lee, Byoung-Dai; Qadir, Junaid
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 16.03.2020

Summary: Multishot Magnetic Resonance Imaging (MRI) is a promising data acquisition technique that can produce a high-resolution image in considerably less acquisition time than the standard spin echo. The downside of multishot MRI is that it is very sensitive to subject motion: even small amounts of motion during the scan can produce artifacts in the final magnetic resonance (MR) image, which may result in a misdiagnosis. Numerous efforts have focused on addressing this issue; however, existing proposals are limited in how much motion they can correct and require excessive computational time. In this paper, we propose a novel generative adversarial network (GAN)-based conjugate gradient SENSE (CG-SENSE) reconstruction framework for motion correction in multishot MRI. First, CG-SENSE reconstruction is employed to reconstruct an image from the motion-corrupted k-space data; the proposed GAN-based framework is then applied to correct the motion artifacts. The proposed method has been rigorously evaluated on synthetically corrupted data with varying degrees of motion, numbers of shots, and encoding trajectories. Our quantitative and qualitative/visual analyses establish that the proposed method is robust and reduces computational time several-fold relative to the current state-of-the-art technique.
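The summary describes a two-stage pipeline: CG-SENSE reconstruction of the motion-corrupted multi-coil k-space data, followed by GAN-based artifact correction. Below is a minimal sketch of the first stage only, on a toy 1-D, two-coil problem: the Gaussian coil profiles, the 2x undersampling pattern, and all sizes and iteration counts are illustrative assumptions, not the paper's configuration, and the trained GAN stage is omitted entirely.

```python
import numpy as np

# Toy CG-SENSE sketch. Forward model y = M F S x, with S the coil
# sensitivities, F the (unnormalized) DFT, and M the k-space sampling mask.
# Conjugate gradient solves the normal equations A^H A x = A^H y.

N = 32
coils = np.stack([
    np.exp(-((np.arange(N) - c) ** 2) / (2 * 12.0 ** 2)) for c in (8, 24)
])                                  # two smooth (assumed) coil profiles
mask = np.zeros(N, dtype=bool)
mask[::2] = True                    # 2x regular undersampling
mask[:4] = True                     # a few fully sampled calibration lines

def A(x):
    """Forward operator: coil-weight, FFT, keep only sampled k-space."""
    return np.fft.fft(coils * x, axis=-1)[:, mask]

def AH(y):
    """Adjoint: zero-fill k-space, inverse FFT, coil-combine."""
    k = np.zeros(coils.shape, dtype=complex)
    k[:, mask] = y
    # np.fft.ifft includes a 1/N factor, so the adjoint of fft is N * ifft.
    return np.sum(np.conj(coils) * np.fft.ifft(k, axis=-1) * N, axis=0)

def cg_sense(y, n_iter=60):
    """Plain conjugate gradient on the SPD system A^H A x = A^H y."""
    x = np.zeros(N, dtype=complex)
    r = AH(y)                       # residual b - A^H A x, with x = 0
    p = r.copy()
    rs = np.vdot(r, r).real
    for _ in range(n_iter):
        Ap = AH(A(p))
        alpha = rs / np.vdot(p, Ap).real
        x += alpha * p
        r -= alpha * Ap
        rs_new = np.vdot(r, r).real
        if rs_new < 1e-14:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Demo: recover a boxcar "image" from undersampled multi-coil data.
x_true = np.zeros(N, dtype=complex)
x_true[10:22] = 1.0
x_rec = cg_sense(A(x_true))
rel_err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
```

In the paper's full pipeline, the image produced by this stage would then be passed to a trained GAN generator to suppress residual motion artifacts; that network is not reproduced here.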
ISSN: 2045-2322
DOI: 10.1038/s41598-020-61705-9