SDDNet: Real-Time Crack Segmentation

Bibliographic Details
Published in IEEE transactions on industrial electronics (1982) Vol. 67; no. 9; pp. 8016–8025
Main Authors Choi, Wooram; Cha, Young-Jin
Format Journal Article
Language English
Published New York IEEE 01.09.2020
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

Abstract This article reports the development of a pure deep learning method for segmenting concrete cracks in images. The objectives are to achieve real-time performance while effectively negating a wide range of complex backgrounds and crack-like features. To achieve these goals, an original convolutional neural network is proposed. The model consists of standard convolutions, densely connected separable convolution modules, a modified atrous spatial pyramid pooling module, and a decoder module. The semantic damage detection network (SDDNet) is trained on a manually created crack dataset, and the trained network records a mean intersection-over-union of 0.846 on the test set. Each test image is analyzed, and representative segmentation results are presented. The results show that SDDNet segments cracks effectively unless the features are too faint. The proposed model is also compared with the most recent models; it returns better evaluation metrics even though it has 88 times fewer parameters than the compared models. In addition, the model processes images of 1025 × 512 pixels in real time (36 FPS), which is 46 times faster than a recent work.
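The mean intersection-over-union (mIoU) reported in the abstract averages per-class IoU over the segmentation classes. The following is a minimal NumPy sketch of that metric (illustrative only; `mean_iou` is a hypothetical helper, not the authors' evaluation code):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes.

    pred, target: integer label maps of identical shape.
    Classes absent from both maps are skipped in the mean.
    """
    ious = []
    for c in range(num_classes):
        p, t = (pred == c), (target == c)
        union = np.logical_or(p, t).sum()
        if union == 0:  # class absent everywhere: skip it
            continue
        ious.append(np.logical_and(p, t).sum() / union)
    return float(np.mean(ious))

# Toy 2x2 example with background (0) and crack (1) labels:
pred = np.array([[0, 1], [0, 1]])
target = np.array([[0, 1], [1, 1]])
# class 0: intersection 1, union 2 -> 0.5
# class 1: intersection 2, union 3 -> 2/3
print(mean_iou(pred, target, num_classes=2))
```

For a binary crack/background task as in this paper, the mean is taken over the two classes; the 0.846 figure is the paper's reported value on its test set.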
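The abstract attributes the small model size partly to separable convolutions, which factor a standard k × k convolution into a depthwise step and a 1 × 1 pointwise step. A hedged sketch of the parameter arithmetic behind that saving (function names and channel sizes are illustrative, not taken from SDDNet):

```python
def conv_params(k, c_in, c_out, bias=False):
    """Weights in a standard k x k convolution, c_in -> c_out channels."""
    return k * k * c_in * c_out + (c_out if bias else 0)

def separable_conv_params(k, c_in, c_out, bias=False):
    """Depthwise k x k convolution followed by a 1 x 1 pointwise convolution."""
    depthwise = k * k * c_in        # one k x k filter per input channel
    pointwise = c_in * c_out        # 1 x 1 convolution mixes channels
    return depthwise + pointwise + (c_out if bias else 0)

# Example: a 3x3 convolution from 64 to 128 channels
std = conv_params(3, 64, 128)            # 9 * 64 * 128 = 73728
sep = separable_conv_params(3, 64, 128)  # 576 + 8192 = 8768
print(std, sep, round(std / sep, 2))
```

For this configuration the separable form uses roughly an eighth of the parameters; the 88× overall reduction claimed in the abstract comes from the full architecture, not this single layer.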
Author Cha, Young-Jin
Choi, Wooram
Author_xml – sequence: 1
  givenname: Wooram
  orcidid: 0000-0001-9099-8334
  surname: Choi
  fullname: Choi, Wooram
  email: choiw@myumanitoba.ca
  organization: Civil Engineering Department, the University of Manitoba, Winnipeg, MB, Canada
– sequence: 2
  givenname: Young-Jin
  orcidid: 0000-0002-0738-5615
  surname: Cha
  fullname: Cha, Young-Jin
  email: young.cha@umanitoba.ca
  organization: Civil Engineering Department, the University of Manitoba, Winnipeg, MB, Canada
CODEN ITIED6
CitedBy_id crossref_primary_10_1016_j_autcon_2023_105262
crossref_primary_10_1061_JBENF2_BEENG_7032
crossref_primary_10_1016_j_autcon_2023_105022
crossref_primary_10_3390_machines11020169
crossref_primary_10_1016_j_tust_2022_104403
crossref_primary_10_3390_buildings13010118
crossref_primary_10_1016_j_autcon_2020_103484
crossref_primary_10_1109_ACCESS_2024_3368376
crossref_primary_10_1016_j_jobe_2024_110814
crossref_primary_10_1016_j_engappai_2024_107976
crossref_primary_10_1109_JSEN_2021_3089718
crossref_primary_10_1038_s41598_023_28530_2
crossref_primary_10_1109_JIOT_2024_3401217
crossref_primary_10_1109_TIM_2023_3298391
crossref_primary_10_3390_app14167194
crossref_primary_10_1007_s13349_023_00684_7
crossref_primary_10_1016_j_tust_2021_103949
crossref_primary_10_1109_ACCESS_2023_3287770
crossref_primary_10_1016_j_rineng_2023_101267
crossref_primary_10_1109_TITS_2024_3492731
crossref_primary_10_1016_j_autcon_2020_103371
crossref_primary_10_1109_TITS_2022_3141827
crossref_primary_10_1016_j_autcon_2023_104840
crossref_primary_10_1111_exsy_13784
crossref_primary_10_1016_j_measurement_2024_116639
crossref_primary_10_1016_j_autcon_2023_104839
crossref_primary_10_1080_10298436_2024_2317432
crossref_primary_10_1109_ACCESS_2023_3283613
crossref_primary_10_13168_cs_2024_0025
crossref_primary_10_1016_j_autcon_2021_103989
crossref_primary_10_1109_ACCESS_2023_3330142
crossref_primary_10_1177_14759217221123485
crossref_primary_10_1109_ACCESS_2020_3011106
crossref_primary_10_1049_ipr2_12976
crossref_primary_10_1007_s13534_024_00415_x
crossref_primary_10_1002_stc_2910
crossref_primary_10_1038_s41598_024_81119_1
crossref_primary_10_1364_OE_435230
crossref_primary_10_1016_j_autcon_2024_105328
crossref_primary_10_1111_mice_12613
crossref_primary_10_3390_s23218824
crossref_primary_10_1016_j_compind_2023_103921
crossref_primary_10_1109_ACCESS_2024_3492193
crossref_primary_10_1080_09349847_2023_2180559
crossref_primary_10_1080_19475683_2023_2166112
crossref_primary_10_1007_s00530_022_01008_3
crossref_primary_10_1016_j_aei_2023_102214
crossref_primary_10_1109_ACCESS_2023_3262702
crossref_primary_10_1016_j_conbuildmat_2021_123896
crossref_primary_10_1109_TITS_2024_3432995
crossref_primary_10_1109_JSEN_2023_3281585
crossref_primary_10_1007_s11440_023_01889_2
crossref_primary_10_1016_j_ymssp_2024_112240
crossref_primary_10_1109_TRS_2024_3516413
crossref_primary_10_1016_j_eswa_2024_124950
crossref_primary_10_1016_j_autcon_2023_104743
crossref_primary_10_1038_s41598_024_63575_x
crossref_primary_10_1109_TITS_2024_3420763
crossref_primary_10_1016_j_autcon_2021_104022
crossref_primary_10_1016_j_autcon_2022_104568
crossref_primary_10_1016_j_autcon_2022_104689
crossref_primary_10_3390_s23010504
crossref_primary_10_1016_j_ijpvp_2023_105112
crossref_primary_10_1007_s00138_024_01591_7
crossref_primary_10_1109_JSEN_2021_3112005
crossref_primary_10_1016_j_aei_2022_101575
crossref_primary_10_1016_j_jii_2022_100403
crossref_primary_10_1016_j_autcon_2021_103606
crossref_primary_10_3233_JIFS_210475
crossref_primary_10_1080_10298436_2023_2286461
crossref_primary_10_1111_jmi_13098
crossref_primary_10_1016_j_conbuildmat_2024_138731
crossref_primary_10_1109_TITS_2023_3325989
crossref_primary_10_1177_14759217211053776
crossref_primary_10_1186_s12938_022_01008_4
crossref_primary_10_1049_ipr2_12512
crossref_primary_10_1080_02564602_2023_2242318
crossref_primary_10_1109_TITS_2022_3204334
crossref_primary_10_1007_s11554_021_01130_x
crossref_primary_10_1177_14759217241254748
crossref_primary_10_1016_j_measurement_2022_111550
crossref_primary_10_1002_2475_8876_12221
crossref_primary_10_1016_j_tust_2024_106108
crossref_primary_10_1109_TIM_2024_3417538
crossref_primary_10_1080_13467581_2023_2238038
crossref_primary_10_1016_j_autcon_2023_105069
crossref_primary_10_3390_app11136017
crossref_primary_10_1109_TITS_2022_3197712
crossref_primary_10_1155_2021_9923704
crossref_primary_10_1016_j_autcon_2023_104895
crossref_primary_10_1016_j_autcon_2023_104894
crossref_primary_10_1016_j_aei_2023_102279
crossref_primary_10_1109_ACCESS_2021_3105279
crossref_primary_10_1109_TITS_2023_3234330
crossref_primary_10_1177_14759217221150376
crossref_primary_10_1016_j_aei_2024_102670
crossref_primary_10_1177_14759217241271000
crossref_primary_10_1016_j_conbuildmat_2024_135151
crossref_primary_10_1109_ACCESS_2021_3073921
crossref_primary_10_1177_14759217241301098
crossref_primary_10_1016_j_autcon_2024_105354
crossref_primary_10_1007_s11042_024_19884_4
crossref_primary_10_3390_drones8120725
crossref_primary_10_1177_1369433220986638
crossref_primary_10_1109_TII_2024_3371982
crossref_primary_10_1177_1369433220986637
crossref_primary_10_1177_13694332241266538
crossref_primary_10_3390_s22062330
crossref_primary_10_1111_mice_12844
crossref_primary_10_1038_s41598_024_54835_x
crossref_primary_10_1109_TIM_2021_3075022
crossref_primary_10_1016_j_autcon_2024_105357
crossref_primary_10_1016_j_dsp_2025_105069
crossref_primary_10_1109_TITS_2021_3134374
crossref_primary_10_1016_j_ijcce_2024_12_003
crossref_primary_10_1007_s00530_024_01408_7
crossref_primary_10_1007_s11803_023_2153_4
crossref_primary_10_3390_app112110310
crossref_primary_10_3390_s20164403
crossref_primary_10_1016_j_ymssp_2020_107537
crossref_primary_10_1142_S0219519423500914
crossref_primary_10_1109_ACCESS_2021_3111223
crossref_primary_10_1109_TITS_2021_3119900
crossref_primary_10_1111_mice_13103
crossref_primary_10_1016_j_autcon_2024_105367
crossref_primary_10_1177_14759217231177314
crossref_primary_10_1080_10589759_2025_2452368
crossref_primary_10_1016_j_autcon_2022_104229
crossref_primary_10_1016_j_engappai_2024_108497
crossref_primary_10_1155_2021_1547025
crossref_primary_10_1007_s13042_023_02054_7
crossref_primary_10_1016_j_tust_2023_105428
crossref_primary_10_1063_5_0053851
crossref_primary_10_1007_s13349_024_00893_8
crossref_primary_10_1177_03611981241297985
crossref_primary_10_1109_ACCESS_2023_3312718
crossref_primary_10_1155_2023_4752072
crossref_primary_10_1007_s11042_025_20729_x
crossref_primary_10_1109_ACCESS_2024_3353729
crossref_primary_10_1109_TII_2022_3147814
crossref_primary_10_1080_10589759_2025_2459310
crossref_primary_10_1080_21681163_2021_1972342
crossref_primary_10_1109_TITS_2024_3424525
crossref_primary_10_1016_j_compeleceng_2024_109764
crossref_primary_10_1080_10298436_2022_2027414
crossref_primary_10_1080_10589759_2024_2406448
crossref_primary_10_1016_j_heliyon_2024_e25892
crossref_primary_10_1080_10589759_2023_2291429
crossref_primary_10_32604_cmes_2021_015875
crossref_primary_10_1007_s00138_020_01158_2
crossref_primary_10_3390_rs15215158
crossref_primary_10_1016_j_matcom_2025_02_003
crossref_primary_10_1111_ppa_13783
crossref_primary_10_3390_s24051542
crossref_primary_10_3390_rs14225793
crossref_primary_10_3390_info15040206
crossref_primary_10_1016_j_conbuildmat_2021_125658
crossref_primary_10_1080_17452007_2023_2244949
crossref_primary_10_1016_j_engappai_2023_107507
crossref_primary_10_1109_TIM_2024_3458059
crossref_primary_10_1016_j_aei_2025_103186
crossref_primary_10_3390_app11115074
crossref_primary_10_1155_2023_3879096
crossref_primary_10_1016_j_dsp_2025_105148
crossref_primary_10_1016_j_procs_2022_09_457
crossref_primary_10_1109_TIE_2022_3204953
crossref_primary_10_1109_TITS_2024_3464528
crossref_primary_10_1177_14759217241305537
crossref_primary_10_1016_j_inpa_2024_03_002
crossref_primary_10_1049_itr2_12173
crossref_primary_10_1109_ACCESS_2021_3069466
crossref_primary_10_1109_TIM_2023_3317386
crossref_primary_10_1016_j_engappai_2024_108300
crossref_primary_10_1016_j_eswa_2023_121686
crossref_primary_10_1155_2023_9982080
crossref_primary_10_1177_14759217241293467
crossref_primary_10_1016_j_cscm_2024_e04131
crossref_primary_10_1016_j_dibe_2022_100088
crossref_primary_10_3390_app142411541
crossref_primary_10_1002_2475_8876_12362
crossref_primary_10_1016_j_autcon_2022_104275
crossref_primary_10_3788_LOP220754
crossref_primary_10_1016_j_aei_2024_102578
crossref_primary_10_1016_j_engappai_2025_110364
crossref_primary_10_3390_s23062938
crossref_primary_10_1109_TII_2022_3233674
crossref_primary_10_1016_j_engappai_2024_108574
crossref_primary_10_1080_15325008_2024_2319325
crossref_primary_10_1080_10298436_2022_2065488
crossref_primary_10_1109_TASE_2023_3309629
crossref_primary_10_1155_2021_5298882
crossref_primary_10_1016_j_tust_2022_104472
crossref_primary_10_3390_fractalfract8080468
crossref_primary_10_1080_15732479_2022_2152840
crossref_primary_10_1016_j_autcon_2024_105614
crossref_primary_10_1016_j_autcon_2024_105612
crossref_primary_10_1061_JCCEE5_CPENG_6339
crossref_primary_10_1007_s11042_022_13152_z
crossref_primary_10_3390_s23094192
crossref_primary_10_1016_j_aei_2024_102584
crossref_primary_10_1109_TIM_2023_3342222
crossref_primary_10_1002_eng2_12872
crossref_primary_10_1007_s00530_024_01509_3
crossref_primary_10_1016_j_engappai_2022_105130
crossref_primary_10_1007_s10921_020_00715_z
crossref_primary_10_3390_rs13142665
crossref_primary_10_3390_agriculture14040591
crossref_primary_10_1016_j_autcon_2021_104017
crossref_primary_10_1016_j_autcon_2022_104412
crossref_primary_10_1061_JCCEE5_CPENG_5512
crossref_primary_10_3390_s22228714
crossref_primary_10_1016_j_autcon_2021_103831
crossref_primary_10_1002_tal_2099
crossref_primary_10_3390_s21124135
crossref_primary_10_1177_14759217221140976
crossref_primary_10_1631_jzus_A2200175
crossref_primary_10_1016_j_engappai_2025_110302
crossref_primary_10_1007_s11709_023_0965_y
crossref_primary_10_1016_j_istruc_2023_05_062
crossref_primary_10_1016_j_measurement_2023_112892
crossref_primary_10_3390_s21030824
crossref_primary_10_1177_14759217221139730
crossref_primary_10_1111_jmi_12906
crossref_primary_10_1177_14759217231168212
crossref_primary_10_1016_j_autcon_2023_105217
crossref_primary_10_3390_en16237726
crossref_primary_10_1002_suco_202400222
crossref_primary_10_1109_ACCESS_2023_3329991
crossref_primary_10_1016_j_wace_2023_100626
crossref_primary_10_1007_s11042_023_15753_8
crossref_primary_10_3390_s21165598
crossref_primary_10_1080_10298436_2023_2258438
crossref_primary_10_1177_14759217221088457
crossref_primary_10_1007_s11709_024_1071_5
crossref_primary_10_1109_TITS_2023_3275570
crossref_primary_10_1080_15732479_2021_1994617
crossref_primary_10_1049_gtd2_12756
crossref_primary_10_3390_s24134288
crossref_primary_10_1007_s11665_023_08923_0
crossref_primary_10_1109_ACCESS_2020_3037667
crossref_primary_10_3390_rs15092400
crossref_primary_10_3390_s22093341
crossref_primary_10_1177_1475921720985437
crossref_primary_10_3390_rs16224267
crossref_primary_10_1109_TITS_2023_3331769
crossref_primary_10_1016_j_autcon_2024_105646
crossref_primary_10_1016_j_tust_2024_106085
crossref_primary_10_1109_TITS_2024_3511036
crossref_primary_10_3390_s23042244
crossref_primary_10_1177_09544097231214578
crossref_primary_10_1109_TITS_2023_3301591
crossref_primary_10_1016_j_bspc_2023_105025
crossref_primary_10_1109_TIM_2025_3546391
crossref_primary_10_2174_2666255813999200918143531
crossref_primary_10_1016_j_ijtst_2023_11_005
crossref_primary_10_1016_j_istruc_2024_107073
crossref_primary_10_1049_ipr2_12940
crossref_primary_10_1109_TIM_2025_3545506
crossref_primary_10_1016_j_autcon_2024_105770
crossref_primary_10_1016_j_autcon_2025_106009
crossref_primary_10_1016_j_conbuildmat_2022_128543
crossref_primary_10_1109_TAI_2024_3366146
crossref_primary_10_1016_j_infrared_2024_105241
crossref_primary_10_1111_mice_12667
crossref_primary_10_1016_j_autcon_2024_105896
crossref_primary_10_1109_TITS_2024_3405995
crossref_primary_10_1002_eng2_12837
crossref_primary_10_1016_j_jtte_2022_11_003
crossref_primary_10_1109_ACCESS_2023_3340310
crossref_primary_10_1016_j_engfracmech_2024_110373
crossref_primary_10_1002_rob_22260
Cites_doi 10.1038/nature14539
10.1016/j.neucom.2019.01.036
10.1109/CVPR.2015.7298965
10.1080/10298436.2018.1485917
10.1109/ITSC.2017.8317714
10.1016/j.autcon.2018.12.006
10.1111/mice.12428
10.1109/CVPR.2017.195
10.1109/TPAMI.2016.2644615
10.1111/mice.12412
10.1145/3287921.3287949
10.1002/stc.2286
10.1007/978-3-030-01234-2_49
10.1109/CVPR.2017.243
10.1109/CVPR.2016.350
10.1016/j.autcon.2018.11.028
10.4231/R7ZC8111
10.1111/mice.12375
10.1109/TPAMI.2016.2577031
10.1111/mice.12334
10.1109/ICDAR.2005.251
10.1111/mice.12263
10.1111/mice.12387
10.1061/(ASCE)CP.1943-5487.0000775
10.1007/978-3-319-50835-1_22
10.1111/mice.12367
10.1109/CVPR.2009.5206848
10.1109/WACV.2017.58
10.1109/CVPR.2016.90
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
DBID 97E
RIA
RIE
AAYXX
CITATION
7SP
8FD
L7M
DOI 10.1109/TIE.2019.2945265
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Electronics & Communications Abstracts
Technology Research Database
Advanced Technologies Database with Aerospace
DatabaseTitle CrossRef
Technology Research Database
Advanced Technologies Database with Aerospace
Electronics & Communications Abstracts
DatabaseTitleList Technology Research Database

Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 1557-9948
EndPage 8025
ExternalDocumentID 10_1109_TIE_2019_2945265
8863123
Genre orig-research
GrantInformation_xml – fundername: Natural Sciences and Engineering Research Council of Canada
  grantid: 1262624; 533690-18
  funderid: 10.13039/501100000038
GroupedDBID -~X
.DC
0R~
29I
4.4
5GY
5VS
6IK
97E
9M8
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
ACKIV
ACNCT
AENEX
AETIX
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ASUFR
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
EBS
EJD
HZ~
H~9
IBMZZ
ICLAB
IFIPE
IFJZH
IPLJI
JAVBF
LAI
M43
MS~
O9-
OCL
P2P
RIA
RIE
RNS
TAE
TN5
TWZ
VH1
VJK
AAYXX
CITATION
RIG
7SP
8FD
L7M
ID FETCH-LOGICAL-c333t-847634e7f3bf3adb1847403294a66b7a38c45d5007f992db4ab0622e7cccd5053
IEDL.DBID RIE
ISSN 0278-0046
IngestDate Mon Jun 30 10:15:15 EDT 2025
Tue Jul 01 00:16:33 EDT 2025
Thu Apr 24 23:02:55 EDT 2025
Wed Aug 27 02:39:16 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 9
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c333t-847634e7f3bf3adb1847403294a66b7a38c45d5007f992db4ab0622e7cccd5053
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0002-0738-5615
0000-0001-9099-8334
OpenAccessLink http://hdl.handle.net/1993/35153
PQID 2401124086
PQPubID 85464
PageCount 10
ParticipantIDs ieee_primary_8863123
crossref_primary_10_1109_TIE_2019_2945265
crossref_citationtrail_10_1109_TIE_2019_2945265
proquest_journals_2401124086
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2020-09-01
PublicationDateYYYYMMDD 2020-09-01
PublicationDate_xml – month: 09
  year: 2020
  text: 2020-09-01
  day: 01
PublicationDecade 2020
PublicationPlace New York
PublicationPlace_xml – name: New York
PublicationTitle IEEE transactions on industrial electronics (1982)
PublicationTitleAbbrev TIE
PublicationYear 2020
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref35
ref13
ref34
ref12
ref37
ref15
ref14
howard (ref28) 2017
kingma (ref39) 2014
ref11
ref10
ref2
yosinski (ref9) 0
ref1
ronneberger (ref17) 0
ref38
ref16
srivastava (ref33) 2014; 15
ref19
ref18
goodfellow (ref3) 2016
simonyan (ref5) 2014
ref24
ref23
abadi (ref36) 0
ref26
ioffe (ref29) 2015
ref25
ref20
dumoulin (ref31) 2016
ref41
ref22
ref21
ref27
ref8
ref7
ref4
ref6
nair (ref30) 0
chen (ref32) 2017
ref40
References_xml – ident: ref2
  doi: 10.1038/nature14539
– start-page: 3320
  year: 0
  ident: ref9
  article-title: How transferable are features in deep neural networks?
  publication-title: Proc Adv Neural Inf Process Syst
– ident: ref41
  doi: 10.1016/j.neucom.2019.01.036
– ident: ref37
  doi: 10.1109/CVPR.2015.7298965
– ident: ref10
  doi: 10.1080/10298436.2018.1485917
– ident: ref18
  doi: 10.1109/ITSC.2017.8317714
– ident: ref24
  doi: 10.1016/j.autcon.2018.12.006
– ident: ref19
  doi: 10.1111/mice.12428
– ident: ref7
  doi: 10.1109/CVPR.2017.195
– start-page: 807
  year: 0
  ident: ref30
  article-title: Rectified linear units improve restricted Boltzmann machines
  publication-title: Proc 27th Int Conf Mach Learn
– ident: ref27
  doi: 10.1109/TPAMI.2016.2644615
– ident: ref20
  doi: 10.1111/mice.12412
– year: 2017
  ident: ref32
  article-title: Rethinking atrous convolution for semantic image segmentation
  publication-title: arXiv:1706.05587
– ident: ref11
  doi: 10.1145/3287921.3287949
– ident: ref23
  doi: 10.1002/stc.2286
– ident: ref26
  doi: 10.1007/978-3-030-01234-2_49
– ident: ref8
  doi: 10.1109/CVPR.2017.243
– ident: ref35
  doi: 10.1109/CVPR.2016.350
– ident: ref22
  doi: 10.1016/j.autcon.2018.11.028
– ident: ref34
  doi: 10.4231/R7ZC8111
– ident: ref12
  doi: 10.1111/mice.12375
– ident: ref13
  doi: 10.1109/TPAMI.2016.2577031
– ident: ref14
  doi: 10.1111/mice.12334
– ident: ref4
  doi: 10.1109/ICDAR.2005.251
– ident: ref1
  doi: 10.1111/mice.12263
– ident: ref15
  doi: 10.1111/mice.12387
– ident: ref21
  doi: 10.1061/(ASCE)CP.1943-5487.0000775
– start-page: 234
  year: 0
  ident: ref17
  article-title: U-Net: Convolutional networks for biomedical image segmentation
  publication-title: Proc Int Conf Med Image Comput Comput-Assisted Intervent
– volume: 15
  start-page: 1929
  year: 2014
  ident: ref33
  article-title: Dropout: A simple way to prevent neural networks from overfitting
  publication-title: J Mach Learn Res
– year: 2015
  ident: ref29
  article-title: Batch normalization: Accelerating deep network training by reducing internal covariate shift
  publication-title: arXiv:1502.03167
– start-page: 265
  year: 0
  ident: ref36
  article-title: TensorFlow: A system for large-scale machine learning
  publication-title: Proc Symp Oper Syst Des Implementation
– year: 2016
  ident: ref3
  publication-title: Deep Learning
– ident: ref38
  doi: 10.1007/978-3-319-50835-1_22
– year: 2016
  ident: ref31
  article-title: A guide to convolution arithmetic for deep learning
  publication-title: arXiv:1603.07285
– ident: ref16
  doi: 10.1111/mice.12367
– year: 2014
  ident: ref5
  article-title: Very deep convolutional networks for large-scale image recognition
  publication-title: arXiv:1409.1556
– ident: ref25
  doi: 10.1109/CVPR.2009.5206848
– year: 2017
  ident: ref28
  article-title: MobileNets: Efficient convolutional neural networks for mobile vision applications
  publication-title: arXiv:1704.04861
– ident: ref40
  doi: 10.1109/WACV.2017.58
– ident: ref6
  doi: 10.1109/CVPR.2016.90
– year: 2014
  ident: ref39
  article-title: Adam: A method for stochastic optimization
  publication-title: arXiv:1412.6980
SSID ssj0014515
Score 2.693699
Snippet This article reports the development of a pure deep learning method for segmenting concrete cracks in images. The objectives are to achieve the real-time...
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 8016
SubjectTerms Artificial neural networks
Computational efficiency
Computer architecture
Convolution
Crack segmentation
Damage detection
Decoding
deep learning (DL)
Feature extraction
Image segmentation
Machine learning
Modules
Real time
Real-time systems
separable convolution
structural health monitoring (SHM)
Title SDDNet: Real-Time Crack Segmentation
URI https://ieeexplore.ieee.org/document/8863123
https://www.proquest.com/docview/2401124086
Volume 67
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE