Deep Feature Learning for Medical Image Analysis with Convolutional Autoencoder Neural Network

Bibliographic Details
Published in IEEE Transactions on Big Data, Vol. 7, No. 4, pp. 750-758
Main Authors Chen, Min; Shi, Xiaobo; Zhang, Yin; Wu, Di; Guizani, Mohsen
Format Journal Article
Language English
Published Piscataway: IEEE, 01.10.2021
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access Get full text
ISSN 2332-7790
EISSN 2372-2096
DOI 10.1109/TBDATA.2017.2717439

Abstract At present, computed tomography (CT) is widely used to assist disease diagnosis. In particular, computer-aided diagnosis (CAD) based on artificial intelligence (AI) has recently demonstrated its importance in intelligent healthcare. However, it remains a great challenge to establish an adequately labeled dataset for CT analysis assistance, due to privacy and security issues. Therefore, this paper proposes a convolutional autoencoder deep learning framework that supports unsupervised feature learning for lung nodule images from unlabeled data and requires only a small amount of labeled data for efficient feature learning. Comprehensive experiments show that the proposed scheme is superior to other approaches and effectively alleviates the labor-intensive burden of manual image labeling. Moreover, the experiments verify that the proposed convolutional autoencoder approach can be extended to similarity measurement of lung nodule images, and that the features extracted through unsupervised learning are also applicable in other related scenarios.
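The record does not include the authors' implementation. As a rough illustration of the general technique the abstract describes, the following is a minimal sketch of a convolutional autoencoder for unsupervised feature learning on small CT patches, written in PyTorch; the patch size, layer widths, and training loop are assumptions for illustration, not details taken from the paper.

```python
# Minimal convolutional autoencoder sketch (PyTorch).
# Assumes 1-channel 32x32 CT patches; all sizes are illustrative, not the paper's.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        # Encoder: compress a 1x32x32 patch into a feat_dim feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # -> 16x16x16
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # -> 32x8x8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, feat_dim),
        )
        # Decoder: reconstruct the patch from the feature vector.
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim, 32 * 8 * 8),
            nn.ReLU(),
            nn.Unflatten(1, (32, 8, 8)),
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # -> 16x16x16
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=4, stride=2, padding=1),   # -> 1x32x32
            nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# Unsupervised pretraining on unlabeled patches: minimize reconstruction error.
model = ConvAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

unlabeled = torch.rand(64, 1, 32, 32)  # placeholder for unlabeled CT patches
for _ in range(5):
    recon, _ = model(unlabeled)
    loss = criterion(recon, unlabeled)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After pretraining, the encoder yields features for downstream use.
with torch.no_grad():
    features = model.encoder(unlabeled)  # shape: (64, 128)
```

In the setting the abstract describes, the decoder would presumably be set aside after pretraining, with the encoder features then used with the small labeled subset for nodule classification and for similarity measurement between nodule images.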
Authors
1. Chen, Min (minchen@ieee.org; ORCID 0000-0002-0960-4447) - Wuhan National Laboratory for Optoelectronics (WNLO) and the School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, China
2. Shi, Xiaobo (xiaoboshi.cs@qq.com) - College of Computer and Information Engineering, Henan Normal University, Xinxiang, China
3. Zhang, Yin (yin.zhang.cn@ieee.org; ORCID 0000-0002-1772-0763) - School of Information and Safety Engineering, Zhongnan University of Economics and Law, Wuhan, China
4. Wu, Di (wudi27@sysu.edu.cn) - Department of Computer Science, School of Data and Computer Science, Sun Yat-sen University, Guangzhou, China
5. Guizani, Mohsen (mguizani@ieee.org; ORCID 0000-0002-8972-8094) - Electrical and Computer Engineering Department, University of Idaho, Moscow, ID, USA
CODEN ITBDAX
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
Discipline Computer Science
EISSN 2372-2096
EndPage 758
Genre orig-research
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 61572220
  funderid: 10.13039/501100001809
– fundername: Fundamental Research Funds for the Central Universities
  grantid: 17LGJC23
  funderid: 10.13039/501100012226
– fundername: National Natural Science Foundation of China; National Science Foundation of China
  grantid: 61572538
  funderid: 10.13039/501100001809
ISSN 2332-7790
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0002-8972-8094
0000-0002-1772-0763
0000-0002-0960-4447
PageCount 9
PublicationDate 2021-10-01
PublicationPlace Piscataway
PublicationTitle IEEE transactions on big data
PublicationTitleAbbrev TBData
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 750
SubjectTerms Artificial intelligence
Biomedical imaging
Computed tomography
Convolutional autoencoder neural network
Convolutional codes
Deep learning
Diagnosis
Feature extraction
feature learning
hand-craft feature
Image analysis
lung nodule
Lungs
Medical imaging
Neural networks
Nodules
Training
unsupervised learning
Title Deep Feature Learning for Medical Image Analysis with Convolutional Autoencoder Neural Network
URI https://ieeexplore.ieee.org/document/7954012
https://www.proquest.com/docview/2560133129
Volume 7