Improving Computer-Aided Detection Using Convolutional Neural Networks and Random View Aggregation

Bibliographic Details
Published in: IEEE Transactions on Medical Imaging, Vol. 35, no. 5, pp. 1170-1181
Main Authors: Roth, Holger R.; Le Lu; Jiamin Liu; Jianhua Yao; Seff, Ari; Cherry, Kevin; Kim, Lauren; Summers, Ronald M.
Format: Journal Article
Language: English
Published: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2016

Abstract Automated computer-aided detection (CADe) in medical imaging has been an important tool in clinical practice and research. State-of-the-art methods often show high sensitivities but at the cost of high false-positive (FP) rates per patient. We design a two-tiered coarse-to-fine cascade framework that first operates a candidate generation system at sensitivities of ~100% but at high FP levels. By leveraging existing CADe systems, coordinates of regions or volumes of interest (ROI or VOI) for lesion candidates are generated in this step and function as input for a second tier, which is our focus in this study. In this second stage, we generate N 2D (two-dimensional) or 2.5D views via sampling through scale transformations, random translations and rotations with respect to each ROI's centroid coordinates. These random views are used to train deep convolutional neural network (ConvNet) classifiers. In testing, the trained ConvNets assign class (e.g., lesion, pathology) probabilities for a new set of N random views that are then averaged at each ROI to compute a final per-candidate classification probability. This second tier behaves as a highly selective process to reject difficult false positives while preserving high sensitivities. The methods are evaluated on three data sets: 59 patients for sclerotic metastasis detection, 176 patients for lymph node detection, and 1,186 patients for colonic polyp detection. Experimental results show the ability of ConvNets to generalize well to different medical imaging CADe applications and scale elegantly to various data sets. Our proposed methods improve CADe performance markedly in all cases. Sensitivities improved from 57% to 70%, from 43% to 77%, and from 58% to 75% at 3 FPs per patient for sclerotic metastases, lymph nodes and colonic polyps, respectively.
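The second-tier aggregation the abstract describes, sampling N randomly transformed views per candidate, scoring each, and averaging the probabilities, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: a toy logistic scorer stands in for a trained ConvNet, `sample_random_view` and its jitter/scale values are illustrative assumptions, and rotation is simplified to axis-aligned 90-degree turns.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_random_view(volume, center, size=32):
    """Extract one randomly transformed 2D view around a candidate centroid.

    Applies a random translation (centroid jitter), a random scale drawn
    from a small set, and a random rotation (simplified here to 90-degree
    turns; the paper uses arbitrary-angle rotations).
    """
    z, y, x = center
    dz, dy, dx = rng.integers(-3, 4, size=3)          # random translation
    half = int(size * rng.choice([0.8, 1.0, 1.2]) / 2)  # random scale
    patch = volume[z + dz,
                   max(y + dy - half, 0): y + dy + half,
                   max(x + dx - half, 0): x + dx + half]
    return np.rot90(patch, k=rng.integers(0, 4))       # random rotation

def aggregate_probability(volume, center, classify, n_views=50):
    """Second tier: average classifier probabilities over N random views."""
    probs = [classify(sample_random_view(volume, center))
             for _ in range(n_views)]
    return float(np.mean(probs))

# Toy stand-in for a trained ConvNet: logistic function of the patch mean.
classify = lambda patch: 1.0 / (1.0 + np.exp(-patch.mean()))

volume = rng.standard_normal((64, 64, 64))   # synthetic CT-like volume
p = aggregate_probability(volume, (32, 32, 32), classify)
print(f"per-candidate probability: {p:.3f}")
```

Averaging over many random views makes the per-candidate score robust to the exact pose of the lesion, which is why this stage can reject difficult false positives without sacrificing the near-100% sensitivity of the first tier.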
Author Kim, Lauren
Jiamin Liu
Roth, Holger R.
Summers, Ronald M.
Le Lu
Seff, Ari
Cherry, Kevin
Jianhua Yao
Author_xml – sequence: 1
  givenname: Holger R.
  surname: Roth
  fullname: Roth, Holger R.
  email: holger.roth@nih.gov
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 2
  surname: Le Lu
  fullname: Le Lu
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 3
  surname: Jiamin Liu
  fullname: Jiamin Liu
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 4
  surname: Jianhua Yao
  fullname: Jianhua Yao
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 5
  givenname: Ari
  surname: Seff
  fullname: Seff, Ari
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 6
  givenname: Kevin
  surname: Cherry
  fullname: Cherry, Kevin
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 7
  givenname: Lauren
  surname: Kim
  fullname: Kim, Lauren
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
– sequence: 8
  givenname: Ronald M.
  surname: Summers
  fullname: Summers, Ronald M.
  organization: Radiol. & Imaging Sci. Dept., Nat. Inst. of Health Clinical Center, Bethesda, MD, USA
BackLink https://www.ncbi.nlm.nih.gov/pubmed/26441412 (View this record in MEDLINE/PubMed)
CODEN ITMID4
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2016
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2016
DOI 10.1109/TMI.2015.2482920
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Ceramic Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Nursing & Allied Health Premium
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
PubMed Central (Full Participant titles)
DatabaseTitle CrossRef
MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
Materials Research Database
Civil Engineering Abstracts
Aluminium Industry Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Ceramic Abstracts
Materials Business File
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Aerospace Database
Nursing & Allied Health Premium
Engineered Materials Abstracts
Biotechnology Research Abstracts
Solid State and Superconductivity Abstracts
Engineering Research Database
Corrosion Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
MEDLINE - Academic
DatabaseTitleList MEDLINE
MEDLINE - Academic
Materials Research Database
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: EIF
  name: MEDLINE
  url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
– sequence: 3
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
Engineering
EISSN 1558-254X
EndPage 1181
ExternalDocumentID PMC7340334
4050811741
26441412
10_1109_TMI_2015_2482920
7279156
Genre orig-research
Journal Article
GrantInformation_xml – fundername: Intramural NIH HHS
  grantid: Z01 CL040004
– fundername: Intramural NIH HHS
  grantid: Z01 CL040003
GroupedDBID ---
-DZ
-~X
.GJ
0R~
29I
4.4
53G
5GY
5RE
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
ACNCT
ACPRK
AENEX
AETIX
AFRAH
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ASUFR
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
EBS
EJD
F5P
HZ~
H~9
IBMZZ
ICLAB
IFIPE
IFJZH
IPLJI
JAVBF
LAI
M43
MS~
O9-
OCL
P2P
PQQKQ
RIA
RIE
RNS
RXW
TAE
TN5
VH1
AAYOK
AAYXX
CITATION
RIG
CGR
CUY
CVF
ECM
EIF
NPM
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
NAPCQ
P64
7X8
5PM
ID FETCH-LOGICAL-c402t-dbdd79147660d6ee4fc24e1bc5e76f3faa01a44f13d94e60349cbba94ed0270e3
IEDL.DBID RIE
ISSN 0278-0062
1558-254X
IngestDate Thu Aug 21 14:02:12 EDT 2025
Fri Jul 11 04:17:48 EDT 2025
Mon Jun 30 06:20:02 EDT 2025
Mon Jul 21 05:58:55 EDT 2025
Tue Jul 01 03:15:56 EDT 2025
Thu Apr 24 22:52:55 EDT 2025
Tue Aug 26 16:42:53 EDT 2025
IsPeerReviewed false
IsScholarly true
Issue 5
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/USG.html
Personal use of this material is permitted. However, permission to use this material for any other purposes must be obtained from the IEEE by sending a request to pubs-permissions@ieee.org.
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c402t-dbdd79147660d6ee4fc24e1bc5e76f3faa01a44f13d94e60349cbba94ed0270e3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ORCID 0000-0002-3662-8743
PMID 26441412
PQID 1787856033
PQPubID 85460
PageCount 12
ParticipantIDs pubmedcentral_primary_oai_pubmedcentral_nih_gov_7340334
crossref_citationtrail_10_1109_TMI_2015_2482920
crossref_primary_10_1109_TMI_2015_2482920
ieee_primary_7279156
proquest_miscellaneous_1788540709
proquest_journals_1787856033
pubmed_primary_26441412
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2016-05-01
PublicationDateYYYYMMDD 2016-05-01
PublicationDate_xml – month: 05
  year: 2016
  text: 2016-05-01
  day: 01
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: New York
PublicationTitle IEEE transactions on medical imaging
PublicationTitleAbbrev TMI
PublicationTitleAlternate IEEE Trans Med Imaging
PublicationYear 2016
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref56
ref12
ref15
ref58
ref53
ref11
ref10
krizhevsky (ref14) 2012
ref16
tajbakhsh (ref52) 2015
yosinski (ref57) 2014
organization (ref1) 2014
tajbakhsh (ref55) 2015
szegedy (ref60) 2014
ref51
krizhevsky (ref31) 2014
ref46
ref45
ref48
ref47
ref42
ref41
ref44
ref43
ref49
ref8
ref7
ref9
ref4
ref3
ref5
ref40
simonyan (ref50) 2014
seff (ref6) 2014
ref35
ref37
ref36
ref33
ref32
prasoon (ref19) 2013
ciresan (ref18) 2012
ref2
ref39
ref38
park (ref54) 2015
cherry (ref25) 2014
liu (ref26) 2014
simonyan (ref59) 2014
ref24
ref23
ciresan (ref17) 2013
ref20
ref22
ref21
yao (ref34) 2006
ref27
hinton (ref29) 2012
srivastava (ref30) 2014; 15
wan (ref28) 2013
References_xml – ident: ref15
  doi: 10.1162/neco.1989.1.4.541
– ident: ref46
  doi: 10.1117/12.2008282
– year: 2015
  ident: ref55
  article-title: A comprehensive computer-aided polyp detection system for colonoscopy videos
  publication-title: Inf Process Med Imag
– ident: ref7
  doi: 10.1109/TMI.2007.892510
– start-page: 2843
  year: 2012
  ident: ref18
  article-title: Deep neural networks segment neuronal membranes in electron microscopy images
  publication-title: Adv Neural Inf Process Syst
– ident: ref58
  doi: 10.1109/CVPR.2014.81
– ident: ref20
  doi: 10.1007/978-3-319-10404-1_65
– volume: 15
  start-page: 1929
  year: 2014
  ident: ref30
  article-title: Dropout: A simple way to prevent neural networks from overfitting
  publication-title: J Mach Learn Res
– ident: ref37
  doi: 10.1109/CVPR.2011.5995359
– ident: ref53
  doi: 10.1007/978-3-319-19992-4_46
– ident: ref27
  doi: 10.1053/j.gastro.2005.08.054
– year: 2014
  ident: ref31
  article-title: One weird trick for parallelizing convolutional neural networks
  publication-title: arXiv preprint arXiv:1404.5997
– year: 2015
  ident: ref54
  article-title: Polyp detection in colonoscopy videos using deeply-learned hierarchical features
  publication-title: Seoul nat'l univ
– ident: ref48
  doi: 10.1145/1390156.1390258
– year: 2012
  ident: ref29
  article-title: Improving neural networks by preventing co-adaptation of feature detectors
  publication-title: arXiv preprint arXiv:1207.0580
– ident: ref12
  doi: 10.1186/1475-925X-13-41
– ident: ref9
  doi: 10.1148/radiol.2252011619
– year: 2014
  ident: ref60
  article-title: Going deeper with convolutions
  publication-title: CoRR abs/1409.4842
– ident: ref56
  doi: 10.1056/NEJMoa0800996
– ident: ref40
  doi: 10.1007/978-3-319-14104-6_16
– ident: ref4
  doi: 10.1148/radiol.13121351
– ident: ref51
  doi: 10.1109/CVPR.2014.223
– ident: ref44
  doi: 10.1109/TMI.2011.2168234
– ident: ref39
  doi: 10.1016/j.patcog.2008.09.034
– ident: ref23
  doi: 10.1162/neco.2009.10-08-881
– ident: ref11
  doi: 10.1109/ISBI.2015.7163869
– year: 2012
  ident: ref14
  article-title: ImageNet classification with deep convolutional neural networks
  publication-title: NIPS
– ident: ref13
  doi: 10.1016/S0031-3203(03)00192-4
– ident: ref35
  doi: 10.1117/12.652288
– year: 2013
  ident: ref19
  article-title: Deep feature learning for knee cartilage segmentation using a triplanar convolutional neural network
  publication-title: MICCAI
– start-page: 568
  year: 2014
  ident: ref50
  article-title: Two-stream convolutional networks for action recognition in videos
  publication-title: Adv Neural Inform Process Syst
– ident: ref38
  doi: 10.3390/a3010021
– start-page: 390
  year: 2006
  ident: ref34
  article-title: Automated spinal column extraction and partitioning
  publication-title: Proc IEEE Int Symp Biomed Imag: From Nano to Macro
– ident: ref49
  doi: 10.1117/12.811101
– ident: ref22
  doi: 10.1109/ICARCV.2014.7064414
– start-page: 3320
  year: 2014
  ident: ref57
  article-title: How transferable are features in deep neural networks?
  publication-title: Adv Neural Inform Process Syst
– ident: ref3
  doi: 10.1117/12.911700
– ident: ref42
  doi: 10.1007/978-3-540-88693-8_34
– year: 2014
  ident: ref59
  article-title: Very deep convolutional networks for large-scale image recognition
  publication-title: arXiv preprint arXiv:1409.1556
– year: 2014
  ident: ref1
  publication-title: Cancer Fact Sheet
– ident: ref43
  doi: 10.1007/978-3-642-04271-3_122
– year: 2014
  ident: ref25
  article-title: Abdominal lymphadenopathy detection using random forest
  publication-title: SPIE Med Imag
– year: 2015
  ident: ref52
  article-title: Computer-aided pulmonary embolism detection using a novel vessel-aligned multi-planar image representation and convolutional neural networks
  publication-title: Proc MICCAI
– start-page: 544
  year: 2014
  ident: ref6
  publication-title: MICCAI
– year: 2013
  ident: ref28
  article-title: Regularization of neural networks using DropConnect
  publication-title: Proc Int Conf Mach Learn
– ident: ref45
  doi: 10.1016/j.media.2012.11.001
– year: 2014
  ident: ref26
  article-title: Mediastinal lymph node detection on thoracic CT scans using spatial prior from multi-atlas label fusion
  publication-title: SPIE Med Imag
– ident: ref5
  doi: 10.1007/s00330-013-2774-5
– ident: ref10
  doi: 10.1109/TMI.2009.2028576
– year: 2013
  ident: ref17
  article-title: Mitosis detection in breast cancer histology images with deep neural networks
  publication-title: MICCAI
– ident: ref24
  doi: 10.1145/2063576.2064004
– ident: ref2
  doi: 10.1016/j.beem.2008.01.011
– ident: ref32
  doi: 10.1109/42.974920
– ident: ref16
  doi: 10.1038/505146a
– ident: ref8
  doi: 10.1109/CVPR.2010.5540008
– ident: ref41
  doi: 10.1007/11866763_57
– ident: ref47
  doi: 10.1145/1961189.1961199
– ident: ref33
  doi: 10.1109/ISBI.2011.5872376
– ident: ref21
  doi: 10.1007/978-3-319-14148-0_1
– ident: ref36
  doi: 10.1117/12.594547
SSID ssj0014509
Score 2.6443954
Snippet Automated computer-aided detection (CADe) has been an important tool in clinical practice and research. State-of-the-art methods often show high sensitivities...
Automated computer-aided detection (CADe) in medical imaging has been an important tool in clinical practice and research. State-of-the-art methods often show...
SourceID pubmedcentral
proquest
pubmed
crossref
ieee
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1170
SubjectTerms Adolescent
Adult
Aged
artificial neural networks
Child
Colonic polyps
Colonic Polyps - diagnostic imaging
Computed tomography
Computer aided diagnosis
Databases, Factual
deep learning
Feature extraction
Female
Humans
Lymph nodes
Lymph Nodes - diagnostic imaging
Machine Learning
Male
medical diagnostic imaging
Middle Aged
multi-layer neural network
Neural networks
Neural Networks, Computer
object detection
Patients
Radiographic Image Interpretation, Computer-Assisted - methods
Spinal Neoplasms - diagnostic imaging
Three-dimensional displays
Tomography, X-Ray Computed
Training
Young Adult
Title Improving Computer-Aided Detection Using Convolutional Neural Networks and Random View Aggregation
URI https://ieeexplore.ieee.org/document/7279156
https://www.ncbi.nlm.nih.gov/pubmed/26441412
https://www.proquest.com/docview/1787856033
https://www.proquest.com/docview/1788540709
https://pubmed.ncbi.nlm.nih.gov/PMC7340334
Volume 35
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Improving+Computer-Aided+Detection+Using+Convolutional+Neural+Networks+and+Random+View+Aggregation&rft.jtitle=IEEE+transactions+on+medical+imaging&rft.au=Roth%2C+Holger+R&rft.au=Lu%2C+Le&rft.au=Liu%2C+Jiamin&rft.au=Yao%2C+Jianhua&rft.date=2016-05-01&rft.eissn=1558-254X&rft.volume=35&rft.issue=5&rft.spage=1170&rft_id=info:doi/10.1109%2FTMI.2015.2482920&rft_id=info%3Apmid%2F26441412&rft.externalDocID=26441412
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=0278-0062&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=0278-0062&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=0278-0062&client=summon