Deep Predictive Motion Tracking in Magnetic Resonance Imaging: Application to Fetal Imaging
Published in | IEEE Transactions on Medical Imaging, Vol. 39, No. 11, pp. 3523-3534
Main Authors | Singh, Ayush; Salehi, Seyed Sadegh Mohseni; Gholipour, Ali
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.11.2020
ISSN | 0278-0062 (print) 1558-254X (electronic)
DOI | 10.1109/TMI.2020.2998600 |
Abstract | Fetal magnetic resonance imaging (MRI) is challenged by uncontrollable, large, and irregular fetal movements. It is, therefore, performed through visual monitoring of fetal motion and repeated acquisitions to ensure diagnostic-quality images are acquired. Nevertheless, visual monitoring of fetal motion based on displayed slices, and navigation at the level of stacks-of-slices is inefficient. The current process is highly operator-dependent, increases scanner usage and cost, and significantly increases the length of fetal MRI scans which makes them hard to tolerate for pregnant women. To help build automatic MRI motion tracking and navigation systems to overcome the limitations of the current process and improve fetal imaging, we have developed a new real-time image-based motion tracking method based on deep learning that learns to predict fetal motion directly from acquired images. Our method is based on a recurrent neural network, composed of spatial and temporal encoder-decoders, that infers motion parameters from anatomical features extracted from sequences of acquired slices. We compared our trained network on held-out test sets (including data with different characteristics, e.g. different fetuses scanned at different ages, and motion trajectories recorded from volunteer subjects) with networks designed for estimation as well as methods adopted to make predictions. The results show that our method outperformed alternative techniques, and achieved real-time performance with average errors of 3.5 and 8 degrees for the estimation and prediction tasks, respectively. Our real-time deep predictive motion tracking technique can be used to assess fetal movements, to guide slice acquisitions, and to build navigation systems for fetal MRI. |
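The abstract describes a pipeline in which a spatial encoder extracts anatomical features from each acquired slice and a recurrent temporal model regresses rigid motion parameters from the resulting feature sequence. A minimal NumPy sketch of that general idea follows; the single-matrix "encoder", the plain Elman-style recurrence, and all layer sizes are illustrative assumptions, not the paper's actual spatial/temporal encoder-decoder architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def spatial_encode(slice_img, W_enc):
    """Stand-in for a convolutional spatial encoder: flatten + linear + tanh."""
    return np.tanh(W_enc @ slice_img.ravel())

def rnn_step(h, x, W_h, W_x):
    """One step of a plain recurrent update carrying temporal context."""
    return np.tanh(W_h @ h + W_x @ x)

def track_motion(slices, W_enc, W_h, W_x, W_out):
    """Regress 6 rigid-motion parameters (3 rotations, 3 translations)
    for each slice in an acquired sequence."""
    hidden = np.zeros(W_h.shape[0])
    params = []
    for s in slices:
        feat = spatial_encode(s, W_enc)       # per-slice anatomical features
        hidden = rnn_step(hidden, feat, W_h, W_x)  # temporal accumulation
        params.append(W_out @ hidden)         # [rx, ry, rz, tx, ty, tz]
    return np.stack(params)

# Toy dimensions: 10 slices of 16x16 pixels, 32-d features, 64-d hidden state.
slices = rng.standard_normal((10, 16, 16))
W_enc = rng.standard_normal((32, 256)) * 0.05
W_h = rng.standard_normal((64, 64)) * 0.05
W_x = rng.standard_normal((64, 32)) * 0.05
W_out = rng.standard_normal((6, 64)) * 0.05

motion = track_motion(slices, W_enc, W_h, W_x, W_out)
print(motion.shape)  # one 6-vector of motion parameters per slice: (10, 6)
```

In the paper the weights are learned from sequences with known motion; here they are random, so the sketch only illustrates the data flow (slice sequence in, per-slice pose parameters out), not trained behavior.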
Author | Singh, Ayush Salehi, Seyed Sadegh Mohseni Gholipour, Ali |
Author_xml | – sequence: 1; givenname: Ayush; surname: Singh; orcidid: 0000-0002-3795-5623; email: ayush.singh@childrens.harvard.edu; organization: Department of Radiology, Boston Children's Hospital, Boston, MA, USA
– sequence: 2; givenname: Seyed Sadegh Mohseni; surname: Salehi; orcidid: 0000-0001-6085-3580; email: sadegh.msalehi@gmail.com; organization: Hyperfine Research Inc., Guilford, CT, USA
– sequence: 3; givenname: Ali; surname: Gholipour; orcidid: 0000-0001-7699-4564; email: ali.gholipour@childrens.harvard.edu; organization: Department of Radiology, Boston Children's Hospital, Boston, MA, USA
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/32746102 (View this record in MEDLINE/PubMed)
CODEN | ITMID4 |
CitedBy_id | crossref_primary_10_1109_TMI_2022_3217725 crossref_primary_10_1016_j_mric_2021_06_007 crossref_primary_10_12998_wjcc_v11_i16_3725 crossref_primary_10_1016_j_bspc_2022_104484 crossref_primary_10_1109_TIP_2023_3333195 crossref_primary_10_1002_nbm_5248 crossref_primary_10_1186_s41747_023_00358_5 crossref_primary_10_1259_bjr_20211205 crossref_primary_10_1109_TMI_2022_3208277 crossref_primary_10_1002_jmri_27759 crossref_primary_10_1109_TBME_2023_3243436 crossref_primary_10_1002_mrm_29803 crossref_primary_10_1002_nano_202200219 crossref_primary_10_1002_mrm_29106 crossref_primary_10_1007_s10334_024_01173_8 crossref_primary_10_3390_biomedicines12122929 crossref_primary_10_1002_jmri_27794 crossref_primary_10_1002_uog_29109 crossref_primary_10_1016_j_neuroimage_2024_120603 crossref_primary_10_3390_diagnostics13142355 |
Cites_doi | 10.1002/(SICI)1522-2594(199911)42:5<963::AID-MRM17>3.0.CO;2-L 10.1007/978-3-030-00928-1_36 10.1038/323533a0 10.1007/978-3-319-46466-4_22 10.1109/ICRA.2017.7989233 10.1162/neco.1997.9.8.1735 10.1109/CVPR.2015.7298758 10.1016/j.neuroimage.2017.04.004 10.1109/TMI.2017.2737081 10.1109/MITS.2014.2357038 10.1109/TMI.2015.2415453 10.1109/CVPR.2018.00762 10.1109/TMI.2020.2974844 10.1109/TMI.2017.2721362 10.1109/TMI.2010.2051680 10.1002/mrm.24314 10.1007/978-3-319-46448-0_45 10.1109/CVPR.2018.00935 10.1109/LRA.2018.2792152 10.1109/IEMBS.2011.6091385 10.1002/mrm.22176 10.1109/CVPR.2017.497 10.1002/mrm.27934 10.1007/978-3-319-46484-8_29 10.3174/ajnr.A5694 10.1016/j.neuroimage.2006.01.015 10.1109/TMI.2018.2798801 10.1016/j.media.2017.04.010 10.1002/cmr.a.21321 10.1109/ISBI.2015.7163836 10.1109/TMI.2018.2866442 10.1007/s10237-015-0738-1 10.1016/j.media.2012.07.004 10.1109/TMI.2016.2555244 10.1097/RMR.0000000000000219 10.1002/1522-2594(200009)44:3<457::AID-MRM17>3.0.CO;2-R 10.1016/j.acra.2006.05.003 10.1109/ISCAS.2017.8050867 10.1109/ISBI.2018.8363675 10.1109/CVPR.2018.00542 10.1016/j.neuroimage.2019.116324 10.3174/ajnr.A3128 10.1109/ICCP.2009.5284727 10.1109/ICCV.2015.308 10.1016/j.neuroimage.2017.04.033 10.1109/CVPR.2017.531 10.1109/IVS.2012.6232277 10.1007/978-3-319-66185-8_34 10.1038/s41598-017-00525-w 10.1109/ITSC.2017.8317904 10.1002/mrm.27381 10.1007/3-540-46805-6_19 10.1002/mrm.27705 10.1007/s00247-016-3677-9 10.1016/j.neubiorev.2018.06.001 10.1109/TMI.2007.895456 10.1109/TMI.2009.2030679 10.1002/mrm.27613 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020 |
Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020 |
DOI | 10.1109/TMI.2020.2998600 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005–Present IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Electronic Library (IEL) CrossRef PubMed Aluminium Industry Abstracts Biotechnology Research Abstracts Ceramic Abstracts Computer and Information Systems Abstracts Corrosion Abstracts Electronics & Communications Abstracts Engineered Materials Abstracts Materials Business File Mechanical & Transportation Engineering Abstracts Solid State and Superconductivity Abstracts METADEX Technology Research Database ANTE: Abstracts in New Technology & Engineering Engineering Research Database Aerospace Database Materials Research Database ProQuest Computer Science Collection Civil Engineering Abstracts Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional Nursing & Allied Health Premium Biotechnology and BioEngineering Abstracts MEDLINE - Academic PubMed Central (Full Participant titles) |
DatabaseTitle | CrossRef PubMed Materials Research Database Civil Engineering Abstracts Aluminium Industry Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Mechanical & Transportation Engineering Abstracts Electronics & Communications Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Ceramic Abstracts Materials Business File METADEX Biotechnology and BioEngineering Abstracts Computer and Information Systems Abstracts Professional Aerospace Database Nursing & Allied Health Premium Engineered Materials Abstracts Biotechnology Research Abstracts Solid State and Superconductivity Abstracts Engineering Research Database Corrosion Abstracts Advanced Technologies Database with Aerospace ANTE: Abstracts in New Technology & Engineering MEDLINE - Academic |
DatabaseTitleList | MEDLINE - Academic Materials Research Database PubMed |
Database_xml | – sequence: 1 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 2 dbid: RIE name: IEEE/IET Electronic Library url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/ sourceTypes: Publisher |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Medicine Engineering |
EISSN | 1558-254X |
EndPage | 3534 |
ExternalDocumentID | PMC7787194 32746102 10_1109_TMI_2020_2998600 9103624 |
Genre | orig-research Research Support, Non-U.S. Gov't Journal Article Research Support, N.I.H., Extramural |
GrantInformation_xml | – fundername: Technological Innovations in Neuroscience Award from the McKnight Foundation funderid: 10.13039/100005270 – fundername: Department of Radiology at Boston Children’s Hospital – fundername: National Institutes of Health (NIH) grantid: R01 EB018988; R01 NS106030 funderid: 10.13039/100000002 – fundername: NINDS NIH HHS grantid: R01 NS106030 – fundername: NIBIB NIH HHS grantid: R01 EB018988 |
ISSN | 0278-0062 1558-254X |
IngestDate | Thu Aug 21 18:06:40 EDT 2025 Fri Jul 11 07:25:37 EDT 2025 Mon Jun 30 04:43:07 EDT 2025 Thu Apr 03 06:53:58 EDT 2025 Tue Jul 01 03:16:03 EDT 2025 Thu Apr 24 23:11:47 EDT 2025 Wed Aug 27 02:31:54 EDT 2025 |
IsPeerReviewed | false |
IsScholarly | true |
Issue | 11 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 Personal use of this material is permitted. However, permission to use this material for any other purposes must be obtained from the IEEE by sending a request to pubs-permissions@ieee.org. |
LinkModel | DirectLink |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
ORCID | 0000-0001-6085-3580 0000-0002-3795-5623 0000-0001-7699-4564 |
PMID | 32746102 |
PQID | 2456527645 |
PQPubID | 85460 |
PageCount | 12 |
ParticipantIDs | pubmed_primary_32746102 proquest_miscellaneous_2430376230 crossref_citationtrail_10_1109_TMI_2020_2998600 proquest_journals_2456527645 pubmedcentral_primary_oai_pubmedcentral_nih_gov_7787194 crossref_primary_10_1109_TMI_2020_2998600 ieee_primary_9103624 |
PublicationCentury | 2000 |
PublicationDate | 2020-11-01 |
PublicationDateYYYYMMDD | 2020-11-01 |
PublicationDate_xml | – month: 11 year: 2020 text: 2020-11-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationPlace_xml | – name: United States – name: New York |
PublicationTitle | IEEE transactions on medical imaging |
PublicationTitleAbbrev | TMI |
PublicationTitleAlternate | IEEE Trans Med Imaging |
PublicationYear | 2020 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
References | ref57 ref13 ref56 ref12 ref59 ref15 ref58 ref14 ref53 ref55 ref11 ref54 ref10 hochreiter (ref52) 2001 ref17 ref16 ref19 ref18 he (ref51) 2016 ref50 ref46 ref45 ref48 ref47 ref42 ref44 ref43 mahendran (ref31) 2017; 1 ref49 piontelli (ref62) 2014 ref8 ref7 ref9 ref4 ref6 ref5 rumelhart (ref38) 1986; 323 ref35 ref34 ref37 ref36 ref30 ref33 ref32 ref2 ref1 ref39 ref24 ondruska (ref41) 2016 ref23 ref26 ref25 ref64 ref20 ref63 ref22 ref65 ref21 ref28 ref27 sutskever (ref40) 2014 ref29 (ref3) 2019 ref60 ref61 |
References_xml | – ident: ref4 doi: 10.1002/(SICI)1522-2594(199911)42:5<963::AID-MRM17>3.0.CO;2-L – start-page: 630 year: 2016 ident: ref51 article-title: Identity mappings in deep residual networks publication-title: Proc Eur Conf Comput Vis – ident: ref20 doi: 10.1007/978-3-030-00928-1_36 – volume: 323 start-page: 533 year: 1986 ident: ref38 article-title: Learning representations by back-propagating errors publication-title: Nature doi: 10.1038/323533a0 – ident: ref27 doi: 10.1007/978-3-319-46466-4_22 – ident: ref28 doi: 10.1109/ICRA.2017.7989233 – ident: ref26 doi: 10.1162/neco.1997.9.8.1735 – ident: ref29 doi: 10.1109/CVPR.2015.7298758 – start-page: 3104 year: 2014 ident: ref40 article-title: Sequence to sequence learning with neural networks publication-title: Proc Adv Neural Inf Process Syst – ident: ref24 doi: 10.1016/j.neuroimage.2017.04.004 – ident: ref18 doi: 10.1109/TMI.2017.2737081 – ident: ref39 doi: 10.1109/MITS.2014.2357038 – start-page: 3361 year: 2016 ident: ref41 article-title: Deep tracking: Seeing beyond seeing using recurrent neural networks publication-title: Proc AAAI Conf Artif Intell – ident: ref17 doi: 10.1109/TMI.2015.2415453 – ident: ref33 doi: 10.1109/CVPR.2018.00762 – ident: ref57 doi: 10.1109/TMI.2020.2974844 – ident: ref49 doi: 10.1109/TMI.2017.2721362 – ident: ref15 doi: 10.1109/TMI.2010.2051680 – ident: ref6 doi: 10.1002/mrm.24314 – ident: ref43 doi: 10.1007/978-3-319-46448-0_45 – ident: ref45 doi: 10.1109/CVPR.2018.00935 – ident: ref47 doi: 10.1109/LRA.2018.2792152 – ident: ref55 doi: 10.1109/IEMBS.2011.6091385 – ident: ref7 doi: 10.1002/mrm.22176 – ident: ref56 doi: 10.1109/CVPR.2017.497 – ident: ref8 doi: 10.1002/mrm.27934 – ident: ref32 doi: 10.1007/978-3-319-46484-8_29 – ident: ref63 doi: 10.3174/ajnr.A5694 – ident: ref54 doi: 10.1016/j.neuroimage.2006.01.015 – ident: ref36 doi: 10.1109/TMI.2018.2798801 – ident: ref21 doi: 10.1016/j.media.2017.04.010 – ident: ref11 doi: 10.1002/cmr.a.21321 – ident: ref22 doi: 
10.1109/ISBI.2015.7163836 – ident: ref25 doi: 10.1109/TMI.2018.2866442 – ident: ref61 doi: 10.1007/s10237-015-0738-1 – ident: ref16 doi: 10.1016/j.media.2012.07.004 – ident: ref37 doi: 10.1109/TMI.2016.2555244 – ident: ref65 doi: 10.1097/RMR.0000000000000219 – ident: ref5 doi: 10.1002/1522-2594(200009)44:3<457::AID-MRM17>3.0.CO;2-R – ident: ref12 doi: 10.1016/j.acra.2006.05.003 – ident: ref46 doi: 10.1109/ISCAS.2017.8050867 – ident: ref48 doi: 10.1109/ISBI.2018.8363675 – year: 2019 ident: ref3 publication-title: Partnering With Families to Minimize Exposure to Anesthesia – ident: ref34 doi: 10.1109/CVPR.2018.00542 – ident: ref53 doi: 10.1016/j.neuroimage.2019.116324 – ident: ref1 doi: 10.3174/ajnr.A3128 – ident: ref59 doi: 10.1109/ICCP.2009.5284727 – ident: ref30 doi: 10.1109/ICCV.2015.308 – ident: ref19 doi: 10.1016/j.neuroimage.2017.04.033 – ident: ref44 doi: 10.1109/CVPR.2017.531 – ident: ref60 doi: 10.1109/IVS.2012.6232277 – ident: ref35 doi: 10.1007/978-3-319-66185-8_34 – ident: ref23 doi: 10.1038/s41598-017-00525-w – ident: ref42 doi: 10.1109/ITSC.2017.8317904 – ident: ref10 doi: 10.1002/mrm.27381 – ident: ref50 doi: 10.1007/3-540-46805-6_19 – start-page: 237 year: 2001 ident: ref52 article-title: Gradient flow in recurrent nets: The difficulty of learning long-term dependencies publication-title: A Field Guide to Dynamical Recurrent Neural Networks – ident: ref9 doi: 10.1002/mrm.27705 – volume: 1 start-page: 4 year: 2017 ident: ref31 article-title: 3D pose regression using convolutional neural networks publication-title: Proc IEEE Int Conf Comput Vis Workshops (ICCVW) – ident: ref2 doi: 10.1007/s00247-016-3677-9 – year: 2014 ident: ref62 publication-title: Development of Normal Fetal Movements – ident: ref64 doi: 10.1016/j.neubiorev.2018.06.001 – ident: ref13 doi: 10.1109/TMI.2007.895456 – ident: ref14 doi: 10.1109/TMI.2009.2030679 – ident: ref58 doi: 10.1002/mrm.27613 |
SSID | ssj0014509 |
SourceID | pubmedcentral proquest pubmed crossref ieee |
SourceType | Open Access Repository Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 3523 |
SubjectTerms | Coders Convolutional neural network Diagnostic systems Dynamics Encoders-Decoders Feature extraction fetal MRI Fetuses Head Image acquisition Image quality image registration long short term memory Magnetic resonance imaging Medical imaging Monitoring Motion detection motion tracking MRI Navigation systems Neural networks Pose estimation prediction Predictions Pregnancy Real time recurrent neural network Recurrent neural networks Resonance Test sets Three-dimensional displays Tracking |
Title | Deep Predictive Motion Tracking in Magnetic Resonance Imaging: Application to Fetal Imaging |
URI | https://ieeexplore.ieee.org/document/9103624 https://www.ncbi.nlm.nih.gov/pubmed/32746102 https://www.proquest.com/docview/2456527645 https://www.proquest.com/docview/2430376230 https://pubmed.ncbi.nlm.nih.gov/PMC7787194 |
Volume | 39 |