Deep learning enables automatic detection and segmentation of brain metastases on multisequence MRI
Published in | Journal of magnetic resonance imaging, Vol. 51, No. 1, pp. 175–182 |
Main Authors | Grøvik, Endre; Yi, Darvin; Iv, Michael; Tong, Elizabeth; Rubin, Daniel; Zaharchuk, Greg |
Format | Journal Article |
Language | English |
Published | Hoboken, USA: John Wiley & Sons, Inc, 01.01.2020 |
Abstract
Background
Detecting and segmenting brain metastases is a tedious and time‐consuming task for many radiologists, particularly with the growing use of multisequence 3D imaging.
Purpose
To demonstrate automated detection and segmentation of brain metastases on multisequence MRI using a deep‐learning approach based on a fully convolutional neural network (CNN).
Study Type
Retrospective.
Population
In all, 156 patients with brain metastases from several primary cancers were included.
Field Strength
1.5T and 3T. [Correction added on May 24, 2019, after first online publication: In the preceding sentence, the first field strength listed was corrected.]
Sequence
Pretherapy MR images included pre‐ and postgadolinium T1‐weighted 3D fast spin echo (CUBE), postgadolinium T1‐weighted 3D axial IR‐prepped FSPGR (BRAVO), and 3D CUBE fluid attenuated inversion recovery (FLAIR).
Assessment
The ground truth was established by manual delineation by two experienced neuroradiologists. CNN training/development was performed using 100 and 5 patients, respectively, with a 2.5D network based on a GoogLeNet architecture. The results were evaluated in 51 patients, equally separated into those with few (1–3), multiple (4–10), and many (>10) lesions.
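The record does not describe the study's preprocessing in detail. Purely as an illustrative sketch of how a 2.5D multisequence input of the kind mentioned above could be assembled (stacking neighboring slices from each MR sequence into input channels; the function name and the `context` parameter are assumptions, not the authors' code):

```python
import numpy as np

def make_25d_input(volumes, slice_idx, context=2):
    """Stack adjacent slices from each MR sequence into one
    multichannel 2.5D sample (channels = n_sequences * (2*context+1)).

    volumes: list of 3D arrays (slices, H, W), one per sequence,
             e.g. [pre_T1, post_T1_cube, bravo, flair].
    """
    channels = []
    n_slices = volumes[0].shape[0]
    for vol in volumes:
        for offset in range(-context, context + 1):
            # clamp at the volume edges so the first/last slices repeat
            idx = min(max(slice_idx + offset, 0), n_slices - 1)
            channels.append(vol[idx])
    return np.stack(channels, axis=0)  # shape: (n_seq * (2*context+1), H, W)
```

With four sequences and two context slices on each side, this yields a 20-channel 2D input per axial position, which a 2D CNN backbone such as GoogLeNet can consume directly.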
Statistical Tests
Network performance was evaluated using precision, recall, Dice/F1 score, and receiver operating characteristic (ROC) curve statistics. For an optimal probability threshold, detection and segmentation performance was assessed on a per‐metastasis basis. The Wilcoxon rank sum test was used to test the differences between patient subgroups.
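For readers unfamiliar with the metrics named above, a minimal NumPy sketch of voxelwise precision, recall, and Dice/F1 on binary masks (illustrative only; the function name is mine, and the paper evaluates these on a per-metastasis basis as well):

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Voxelwise precision, recall, and Dice/F1 for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()   # true positives
    fp = np.logical_and(pred, ~truth).sum()  # false positives
    fn = np.logical_and(~pred, truth).sum()  # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    dice = 2 * tp / (2 * tp + fp + fn) if tp else 0.0  # equals F1
    return precision, recall, dice
```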
Results
The area under the ROC curve (AUC), averaged across all patients, was 0.98 ± 0.04. The AUC in the subgroups was 0.99 ± 0.01, 0.97 ± 0.05, and 0.97 ± 0.03 for patients having 1–3, 4–10, and >10 metastases, respectively. Using an average optimal probability threshold determined by the development set, precision, recall, and Dice score were 0.79 ± 0.20, 0.53 ± 0.22, and 0.79 ± 0.12, respectively. At the same probability threshold, the network showed an average false‐positive rate of 8.3/patient (no lesion‐size limit) and 3.4/patient (10 mm³ lesion‐size limit).
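The record does not state how the optimal probability threshold was chosen on the development set; one common choice is the threshold maximizing Youden's J statistic (TPR − FPR) along the ROC curve. A minimal sketch under that assumption (function name mine):

```python
import numpy as np

def youden_threshold(probs, labels, n_steps=101):
    """Pick the probability threshold maximizing Youden's J = TPR - FPR."""
    labels = labels.astype(bool)
    best_t, best_j = 0.0, -1.0
    for t in np.linspace(0.0, 1.0, n_steps):
        pred = probs >= t
        tpr = np.logical_and(pred, labels).sum() / max(labels.sum(), 1)
        fpr = np.logical_and(pred, ~labels).sum() / max((~labels).sum(), 1)
        j = tpr - fpr
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```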
Data Conclusion
A deep‐learning approach using multisequence MRI can automatically detect and segment brain metastases with high accuracy.
Level of Evidence: 3
Technical Efficacy Stage: 2
J. Magn. Reson. Imaging 2020;51:175–182.
Author | Yi, Darvin; Grøvik, Endre; Iv, Michael; Rubin, Daniel; Zaharchuk, Greg; Tong, Elizabeth |
AuthorAffiliation | 1. Department of Radiology, Stanford University, Stanford, California, USA; 2. Department for Diagnostic Physics, Oslo University Hospital, Oslo, Norway; 3. Department of Biomedical Data Science, Stanford University, Stanford, California, USA |
ContentType | Journal Article |
Copyright | 2019 International Society for Magnetic Resonance in Medicine 2019 International Society for Magnetic Resonance in Medicine. 2020 International Society for Magnetic Resonance in Medicine |
DOI | 10.1002/jmri.26766 |
Discipline | Medicine |
EISSN | 1522-2586 |
EndPage | 182 |
ExternalDocumentID | PMC7199496 31050074 10_1002_jmri_26766 JMRI26766 |
Genre | article Research Support, Non-U.S. Gov't Journal Article |
GrantInformation | Norges Forskningsråd (261984); Kreftforeningen (3434180, 6817564); Helse Sør‐Øst RHF (2013069, 2016102); NLM NIH HHS (T15 LM007033) |
ISSN | 1053-1807 1522-2586 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Keywords | deep learning; segmentation; multisequence; brain metastases |
Language | English |
License | 2019 International Society for Magnetic Resonance in Medicine. |
Notes | E.G. and D.Y. are co‐first authors. |
ORCID | 0000-0002-9925-1162 |
PMID | 31050074 |
PageCount | 8 |
PublicationCentury | 2000 |
PublicationDate | January 2020 |
PublicationDateYYYYMMDD | 2020-01-01 |
PublicationDecade | 2020 |
PublicationPlace | Hoboken, USA |
PublicationSubtitle | JMRI |
PublicationTitle | Journal of magnetic resonance imaging |
PublicationTitleAlternate | J Magn Reson Imaging |
PublicationYear | 2020 |
Publisher | John Wiley & Sons, Inc Wiley Subscription Services, Inc |
volume: 95 start-page: 43 year: 2018 end-page: 54 article-title: Automatic detection and segmentation of brain metastases on multimodal MR images with a deep convolutional neural network publication-title: Comput Biol Med – start-page: 234 year: 2015 end-page: 241 article-title: U‐Net: Convolutional networks for biomedical image segmentation publication-title: Int Conf Med Image Comput Comput Interv – ident: e_1_2_6_4_1 doi: 10.1093/neuonc/nox077 – ident: e_1_2_6_3_1 doi: 10.1007/s11912-011-0203-y – ident: e_1_2_6_18_1 doi: 10.1016/j.compbiomed.2018.02.004 – start-page: 6230 year: 2017 ident: e_1_2_6_28_1 article-title: Pyramid scene parsing network publication-title: IEEE Conf Comput Vis Pattern Recognit – start-page: 125 year: 2016 ident: e_1_2_6_12_1 article-title: Deep learning trends for focal brain pathology segmentation in MRI publication-title: Mach Learn Heal Inform – start-page: 306 year: 2015 ident: e_1_2_6_14_1 article-title: Deep feature learning with discrimination mechanism for brain tumor segmentation and diagnosis publication-title: IEEE Int Conf Intell Inf Hiding Multimed Signal Process – ident: e_1_2_6_15_1 doi: 10.1016/j.media.2016.10.004 – ident: e_1_2_6_32_1 doi: 10.1093/jrr/rrs053 – ident: e_1_2_6_24_1 doi: 10.1371/journal.pone.0178265 – start-page: 6980 ident: e_1_2_6_22_1 publication-title: Ba JA. A method for stochastic optimization. 
arXiv Prepr 2014:arXiv – ident: e_1_2_6_7_1 doi: 10.1016/S1470-2045(15)70057-4 – ident: e_1_2_6_19_1 doi: 10.1088/0031-9155/61/24/8440 – ident: e_1_2_6_13_1 doi: 10.1109/TMI.2016.2538465 – start-page: 936 year: 2017 ident: e_1_2_6_29_1 article-title: Feature pyramid networks for object detection publication-title: IEEE Conf Comput Vis Pattern Recognit – start-page: 1 year: 2015 ident: e_1_2_6_21_1 article-title: Going deeper with convolutions publication-title: IEEE Conf Comput Vis Pattern Recognit – ident: e_1_2_6_6_1 doi: 10.1002/(SICI)1097-0142(19961015)78:8<1781::AID-CNCR19>3.0.CO;2-U – start-page: 770 year: 2016 ident: e_1_2_6_25_1 article-title: Deep residual learning for image recognition publication-title: IEEE Conf Comput Vis Pattern Recognit – ident: e_1_2_6_20_1 doi: 10.1002/hbm.10062 – ident: e_1_2_6_2_1 doi: 10.1093/neuonc/now127 – ident: e_1_2_6_30_1 doi: 10.1007/s10278-015-9839-8 – ident: e_1_2_6_5_1 doi: 10.1186/1756-9966-30-10 – ident: e_1_2_6_16_1 doi: 10.1016/j.media.2017.10.002 – start-page: 177 year: 2017 ident: e_1_2_6_8_1 article-title: Computer‐based segmentation, change detection and quantification for lesions in multiple sclerosis publication-title: IEEE Int Conf Comput Sci Eng – start-page: 174 volume-title: IEEE 14th Int Symp Biomed Imaging ident: e_1_2_6_10_1 – start-page: 234 year: 2015 ident: e_1_2_6_27_1 article-title: U‐Net: Convolutional networks for biomedical image segmentation publication-title: Int Conf Med Image Comput Comput Interv – start-page: 1 year: 2017 ident: e_1_2_6_9_1 article-title: Review of liver segmentation and computer assisted detection/diagnosis methods in computed tomography publication-title: Artif Intell Rev – ident: e_1_2_6_11_1 doi: 10.1007/s10278-017-9983-4 – volume: 39 start-page: 1635 year: 2018 ident: e_1_2_6_33_1 article-title: Evaluation of thick‐slab overlapping MIP images of contrast‐enhanced 3D T1‐weighted CUBE for detection of intracranial metastases: A pilot study for comparison of lesion 
detection, interpretation time, and sensitivity with nonoverlapping CUBE MIP, CUBE, and inversion‐recovery‐prepared fast‐spoiled gradient recalled brain volume publication-title: Am J Neuroradiol doi: 10.3174/ajnr.A5747 – ident: e_1_2_6_23_1 doi: 10.1016/j.procs.2016.09.407 – start-page: 2261 year: 2017 ident: e_1_2_6_26_1 article-title: Densely connected convolutional networks publication-title: IEEE Conf Comput Vis Pattern Recognit – ident: e_1_2_6_31_1 doi: 10.1118/1.4898200 – ident: e_1_2_6_35_1 doi: 10.1111/cgf.12193 – volume: 12 start-page: e0185844 year: 2017 ident: e_1_2_6_17_1 article-title: A deep convolutional neural network‐based automatic delineation strategy for multiple brain metastases stereotactic radiosurgery publication-title: PLoS One doi: 10.1371/journal.pone.0185844 – ident: e_1_2_6_34_1 doi: 10.1259/bjr.20110718 |
Snippet | Background
Detecting and segmenting brain metastases is a tedious and time‐consuming task for many radiologists, particularly with the growing use of multisequence 3D imaging. |
SourceID | pubmedcentral proquest pubmed crossref wiley |
SourceType | Open Access Repository Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 175 |
SubjectTerms | Adult Aged Artificial neural networks Automation Brain Brain - diagnostic imaging Brain cancer brain metastases Brain Neoplasms - diagnostic imaging Brain Neoplasms - secondary Convolution Deep Learning Female Field strength Ground truth Humans Image Interpretation, Computer-Assisted - methods Image processing Image segmentation Lesions Magnetic resonance imaging Magnetic Resonance Imaging - methods Male Medical imaging Metastases Metastasis Middle Aged multisequence Neural networks Neuroimaging Patients Performance evaluation Population studies Recall Retrospective Studies segmentation Sensitivity and Specificity Statistical analysis Statistical tests Subgroups Toxicity |
Title | Deep learning enables automatic detection and segmentation of brain metastases on multisequence MRI |
URI | https://onlinelibrary.wiley.com/doi/abs/10.1002%2Fjmri.26766 https://www.ncbi.nlm.nih.gov/pubmed/31050074 https://www.proquest.com/docview/2323275564 https://www.proquest.com/docview/2229234895 https://pubmed.ncbi.nlm.nih.gov/PMC7199496 |
Volume | 51 |
linkProvider | Wiley-Blackwell |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Deep+learning+enables+automatic+detection+and+segmentation+of+brain+metastases+on+multisequence+MRI&rft.jtitle=Journal+of+magnetic+resonance+imaging&rft.au=Gr%C3%B8vik%2C+Endre&rft.au=Yi%2C+Darvin&rft.au=Iv%2C+Michael&rft.au=Tong%2C+Elizabeth&rft.date=2020-01-01&rft.issn=1053-1807&rft.eissn=1522-2586&rft.volume=51&rft.issue=1&rft.spage=175&rft.epage=182&rft_id=info:doi/10.1002%2Fjmri.26766&rft.externalDBID=n%2Fa&rft.externalDocID=10_1002_jmri_26766 |