Magnetic resonance image (MRI) synthesis from brain computed tomography (CT) images based on deep learning methods for magnetic resonance (MR)-guided radiotherapy

Bibliographic Details
Published in: Quantitative imaging in medicine and surgery, Vol. 10, no. 6, pp. 1223-1236
Main Authors: Li, Wen; Li, Yafen; Qin, Wenjian; Liang, Xiaokun; Xu, Jianyang; Xiong, Jing; Xie, Yaoqin
Format: Journal Article
Language: English
Published: AME Publishing Company, China, 01.06.2020
Abstract Precise patient setup is critical in radiation therapy, and medical imaging plays an essential role in it. Compared with computed tomography (CT) images, magnetic resonance imaging (MRI) provides high soft-tissue contrast, which makes it a promising imaging modality during treatment. In this paper, we propose a method to synthesize brain MRI images from corresponding planning CT (pCT) images. The synthetic MRI (sMRI) images can be aligned with positioning MRI (pMRI) acquired by an MRI-guided accelerator, avoiding the disadvantages of multi-modality image registration. Several deep learning network models were applied to this brain MRI synthesis task, including CycleGAN, the Pix2Pix model, and U-Net. We evaluated these methods with several metrics: mean absolute error (MAE), mean squared error (MSE), structural similarity index (SSIM), and peak signal-to-noise ratio (PSNR). In our experiments, U-Net with L1+L2 loss achieved the best results, with the lowest overall average MAE of 74.19 and MSE of 1.035×10^4, and the highest SSIM of 0.9440 and PSNR of 32.44. Quantitative comparisons suggest that U-Net, a supervised deep learning method, outperforms CycleGAN, a typical unsupervised method, in our brain MRI synthesis procedure. The proposed method converts pCT/pMRI multi-modality registration into mono-modality registration, which can reduce registration error and achieve a more accurate patient setup.
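
The best-performing configuration above trains a U-Net with a combined L1+L2 loss. As a hedged illustration only, a minimal PyTorch sketch of such a composite loss follows; the class name L1L2Loss and the weights alpha and beta are assumptions for this sketch, not values taken from the paper.

import torch
import torch.nn as nn

class L1L2Loss(nn.Module):
    """Composite L1 + L2 loss for image synthesis (illustrative sketch).

    alpha and beta are assumed weights; the paper reports using an L1+L2
    loss but this record does not give its exact weighting.
    """

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        super().__init__()
        self.alpha = alpha
        self.beta = beta
        self.l1 = nn.L1Loss()   # mean absolute error term
        self.l2 = nn.MSELoss()  # mean squared error term

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # L1 is robust to outliers and tends to preserve edges; L2 penalizes
        # large intensity errors more heavily and gives smooth gradients.
        return self.alpha * self.l1(pred, target) + self.beta * self.l2(pred, target)

# Example with hypothetical 1-channel 256x256 batches of sMRI and real MRI.
if __name__ == "__main__":
    criterion = L1L2Loss()
    s_mri = torch.rand(2, 1, 256, 256)  # hypothetical synthetic MRI batch
    p_mri = torch.rand(2, 1, 256, 256)  # hypothetical ground-truth MRI batch
    print(criterion(s_mri, p_mri).item())

Summing the two terms is one common way to combine L1's edge-preserving robustness with L2's strong penalty on large intensity errors.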
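
The four evaluation metrics reported in the abstract (MAE, MSE, PSNR, SSIM) are standard image-quality measures. A small NumPy/scikit-image sketch of how they could be computed between a synthetic and a real MRI slice follows; the data_range value and the random test arrays are assumptions, since this record does not specify the paper's intensity normalization.

import numpy as np
from skimage.metrics import structural_similarity

def mae(pred: np.ndarray, target: np.ndarray) -> float:
    # Mean absolute error over all pixels.
    return float(np.mean(np.abs(pred - target)))

def mse(pred: np.ndarray, target: np.ndarray) -> float:
    # Mean squared error over all pixels.
    return float(np.mean((pred - target) ** 2))

def psnr(pred: np.ndarray, target: np.ndarray, data_range: float) -> float:
    # Peak signal-to-noise ratio in dB, relative to the image dynamic range.
    return float(10.0 * np.log10(data_range ** 2 / mse(pred, target)))

# Example with hypothetical slices; data_range=255.0 is an assumption.
pred = np.random.rand(256, 256) * 255.0
target = np.random.rand(256, 256) * 255.0
print(mae(pred, target), mse(pred, target), psnr(pred, target, 255.0))
print(structural_similarity(pred, target, data_range=255.0))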
Copyright: 2020 Quantitative Imaging in Medicine and Surgery. All rights reserved.
DOI: 10.21037/qims-19-885
Discipline: Medicine
EISSN: 2223-4306
ExternalDocumentID: PMC7276358 (PMCID)
ISSN: 2223-4292
Issue: 6
Keywords: radiotherapy; computed tomography (CT); patient setup; image synthesis; magnetic resonance image (MRI)
Notes: These authors contributed equally to this work.
OpenAccessLink: https://qims.amegroups.com/article/viewFile/41784/pdf
PMID: 32550132
PublicationDate: 2020-06-01
PublicationPlace: China
PublicationTitle: Quantitative imaging in medicine and surgery
PublicationTitleAlternate: Quant Imaging Med Surg
PublicationYear: 2020
Publisher: AME Publishing Company
SubjectTerms: Original
URI: https://www.ncbi.nlm.nih.gov/pubmed/32550132
  https://www.proquest.com/docview/2414734365
  https://pubmed.ncbi.nlm.nih.gov/PMC7276358
Volume: 10