Fully automatic segmentation on prostate MR images based on cascaded fully convolution network

Bibliographic Details
Published in Journal of magnetic resonance imaging, Vol. 49, no. 4, pp. 1149-1156
Main Authors: Zhu, Yi; Wei, Rong; Gao, Ge; Ding, Lian; Zhang, Xiaodong; Wang, Xiaoying; Zhang, Jue
Format Journal Article
Language English
Published United States Wiley Subscription Services, Inc 01.04.2019
Abstract Background Computer‐aided diagnosis (CAD) can aid radiologists in quantifying prostate cancer, and MRI segmentation plays an essential role in CAD applications. Clinical experience shows that prostate cancer occurs predominantly in the peripheral zone (PZ), and different evaluation criteria exist for different regions in the Prostate Imaging Reporting and Data System (PI‐RADS). Purpose To develop a fully automatic approach to segmenting the prostate outer contour and the PZ contour with high efficacy. Population In all, 163 subjects, including 61 healthy subjects and 102 prostate cancer patients. For each subject, all slices that contained the prostate gland in diffusion‐weighted images (DWIs) and T2‐weighted images (T2WIs) were selected as the datasets. Field Strength T2‐weighted, DWI at 3.0T. Assessment The computer‐generated segmentation results were compared with manual outlines delineated by two experts with more than 5 years of experience. Dice similarity coefficient (DSC), false‐positive rate (FPR), and false‐negative rate (FNR) were used to compare the algorithm and manual segmentation results. Statistical Tests A paired t‐test was adopted for comparison between our method and classical U‐Net segmentation methods. Results The mean DSC was 92.7 ± 4.2% for the whole prostate gland and 79.3 ± 10.4% for the peripheral zone. Compared with classical U‐Net segmentation methods, our segmentation precision was significantly higher (P < 0.001). Data Conclusion By cropping the region of interest and cascading the two networks, our method gradually balances the positive and negative samples, resulting in higher segmentation accuracy. This fully automatic strategy could provide satisfactory performance in prostate DWI‐ and T2WI‐based image segmentation. Level of Evidence: 2. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2019;49:1149–1156.
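The Assessment entry evaluates the automatic masks against expert outlines with the Dice similarity coefficient, false-positive rate, and false-negative rate. Below is a minimal NumPy sketch of such overlap metrics for binary masks; the DSC formula is standard, while the FPR/FNR normalization used here (relative to the reference mask) is an assumption, since the record does not state the authors' exact definition.

```python
import numpy as np

def overlap_metrics(pred, ref):
    """Overlap metrics between an algorithm mask (pred) and a manual mask (ref).

    Both inputs are boolean arrays of identical shape. FPR and FNR are
    normalized by the reference volume here -- an assumed convention, as the
    record does not give the authors' exact definition.
    """
    pred, ref = pred.astype(bool), ref.astype(bool)
    tp = np.count_nonzero(pred & ref)
    fp = np.count_nonzero(pred & ~ref)
    fn = np.count_nonzero(~pred & ref)
    denom = np.count_nonzero(pred) + np.count_nonzero(ref)
    dsc = 2.0 * tp / denom if denom else 1.0      # Dice similarity coefficient
    fpr = fp / max(np.count_nonzero(ref), 1)      # assumed: FP relative to reference
    fnr = fn / max(np.count_nonzero(ref), 1)      # assumed: FN relative to reference
    return dsc, fpr, fnr
```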
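The Statistical Tests entry reports a paired t-test between the cascaded method and a classical U-Net. A minimal SciPy sketch, assuming hypothetical per-subject DSC arrays for the two methods:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-subject Dice scores (same subjects, same order) for the
# cascaded network and a classical U-Net baseline.
dsc_cascaded = np.array([0.93, 0.91, 0.95, 0.90, 0.94])
dsc_unet     = np.array([0.88, 0.87, 0.91, 0.86, 0.90])

t_stat, p_value = ttest_rel(dsc_cascaded, dsc_unet)   # paired t-test
print(f"paired t-test: t = {t_stat:.3f}, p = {p_value:.4g}")
```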
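The Data Conclusion credits the accuracy gain to cropping the region of interest and cascading two networks, so the second stage sees far fewer background voxels and the positive/negative balance improves. The sketch below illustrates that generic cascade pattern only; `coarse_model` and `fine_model` are placeholder callables, not the authors' published networks.

```python
import numpy as np

def cascaded_segmentation(image, coarse_model, fine_model, margin=8):
    """Two-stage cascade: coarse whole-gland mask -> ROI crop -> fine PZ mask.

    `image` is a 2D slice; `coarse_model` and `fine_model` are placeholder
    callables mapping an image array to a binary mask of the same shape.
    """
    whole_gland = coarse_model(image).astype(bool)          # stage 1: whole prostate
    pz_full = np.zeros_like(whole_gland)
    if not whole_gland.any():                               # no gland on this slice
        return whole_gland, pz_full

    rows, cols = np.nonzero(whole_gland)                    # bounding box of coarse mask
    r0 = max(int(rows.min()) - margin, 0)
    r1 = min(int(rows.max()) + margin + 1, image.shape[0])
    c0 = max(int(cols.min()) - margin, 0)
    c1 = min(int(cols.max()) + margin + 1, image.shape[1])

    roi = image[r0:r1, c0:c1]                               # cropped ROI: mostly prostate now
    pz_roi = fine_model(roi).astype(bool)                   # stage 2: peripheral zone in the ROI

    pz_full[r0:r1, c0:c1] = pz_roi                          # paste back into slice coordinates
    return whole_gland, pz_full
```

Cropping to the coarse bounding box plus a margin is one simple way to realize the rebalancing the abstract describes.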
Author Wei, Rong
Zhang, Xiaodong
Wang, Xiaoying
Zhu, Yi
Gao, Ge
Zhang, Jue
Ding, Lian
Author_xml – sequence: 1
  givenname: Yi
  surname: Zhu
  fullname: Zhu, Yi
  organization: Peking University
– sequence: 2
  givenname: Rong
  surname: Wei
  fullname: Wei, Rong
  organization: Peking University
– sequence: 3
  givenname: Ge
  surname: Gao
  fullname: Gao, Ge
  organization: Peking University First Hospital
– sequence: 4
  givenname: Lian
  surname: Ding
  fullname: Ding, Lian
  organization: Peking University
– sequence: 5
  givenname: Xiaodong
  surname: Zhang
  fullname: Zhang, Xiaodong
  organization: Peking University First Hospital
– sequence: 6
  givenname: Xiaoying
  surname: Wang
  fullname: Wang, Xiaoying
  email: cjr.wangxiaoying@vip.163.com
  organization: Peking University First Hospital
– sequence: 7
  givenname: Jue
  orcidid: 0000-0003-0440-1357
  surname: Zhang
  fullname: Zhang, Jue
  email: zhangjue@pku.edu.cn
  organization: Peking University
BackLink https://www.ncbi.nlm.nih.gov/pubmed/30350434 (View this record in MEDLINE/PubMed)
CitedBy_id crossref_primary_10_1007_s44163_024_00162_z
crossref_primary_10_1002_jmri_28608
crossref_primary_10_1016_j_dsp_2019_102649
crossref_primary_10_1007_s00330_021_08408_5
crossref_primary_10_1186_s13244_022_01340_2
crossref_primary_10_1109_TMI_2022_3197180
crossref_primary_10_3390_ijerph17134789
crossref_primary_10_1016_j_diii_2023_08_001
crossref_primary_10_1148_rycan_2021200024
crossref_primary_10_1109_ACCESS_2022_3232561
crossref_primary_10_1186_s12916_024_03742_z
crossref_primary_10_1186_s13244_023_01421_w
crossref_primary_10_3390_electronics9081199
crossref_primary_10_1088_1361_6560_ac02d3
crossref_primary_10_3390_cancers13030552
crossref_primary_10_1088_1742_6596_1748_4_042058
crossref_primary_10_1016_j_acra_2023_09_035
crossref_primary_10_1155_2020_8861035
crossref_primary_10_3390_cancers12051204
crossref_primary_10_1016_j_heliyon_2023_e16810
crossref_primary_10_1016_j_softx_2019_100347
crossref_primary_10_3390_s21082709
crossref_primary_10_1016_j_neucom_2022_07_070
crossref_primary_10_1109_JBHI_2023_3289913
crossref_primary_10_1016_j_bspc_2024_106187
crossref_primary_10_1109_ACCESS_2021_3090825
crossref_primary_10_1016_j_ajodo_2020_05_017
crossref_primary_10_1148_ryai_230138
crossref_primary_10_1002_mp_16343
crossref_primary_10_1007_s10334_022_01031_5
crossref_primary_10_1109_ACCESS_2023_3338746
crossref_primary_10_1002_mp_14517
crossref_primary_10_1002_acm2_14244
crossref_primary_10_1002_mrm_28257
crossref_primary_10_1109_ACCESS_2023_3313420
crossref_primary_10_3389_fonc_2022_958065
crossref_primary_10_1016_j_neunet_2024_106782
crossref_primary_10_1007_s00261_022_03583_5
crossref_primary_10_1186_s12880_021_00703_3
crossref_primary_10_1016_j_clinimag_2022_04_007
crossref_primary_10_1186_s13244_025_01898_7
crossref_primary_10_1142_S0219467822500310
crossref_primary_10_1055_a_1192_9305
crossref_primary_10_3390_app11020844
crossref_primary_10_3390_diagnostics12020289
crossref_primary_10_3390_ma13122798
crossref_primary_10_3389_fonc_2023_1095353
crossref_primary_10_3390_diagnostics11111964
crossref_primary_10_17650_1726_9776_2023_19_2_101_110
crossref_primary_10_1007_s00261_024_04241_8
crossref_primary_10_1186_s13244_021_01044_z
crossref_primary_10_1067_j_cpradiol_2021_06_006
crossref_primary_10_17341_gazimmfd_1153507
crossref_primary_10_1038_s41598_020_71080_0
crossref_primary_10_23736_S2723_9284_23_00255_0
crossref_primary_10_1016_j_ijmedinf_2023_105279
crossref_primary_10_1007_s11912_023_01371_y
crossref_primary_10_1016_j_ejrad_2023_110887
crossref_primary_10_1053_j_sodo_2021_05_002
crossref_primary_10_1007_s11042_021_11044_2
crossref_primary_10_1016_j_bbe_2020_07_011
crossref_primary_10_1109_ACCESS_2023_3268576
crossref_primary_10_3390_app10072601
crossref_primary_10_3390_cancers15082335
crossref_primary_10_1016_j_clinimag_2020_10_014
crossref_primary_10_1109_JBHI_2024_3384970
crossref_primary_10_1007_s11042_019_07934_1
crossref_primary_10_1002_jmri_27565
crossref_primary_10_1002_mp_14022
crossref_primary_10_3390_cancers16101809
crossref_primary_10_1007_s00330_019_06467_3
crossref_primary_10_1002_ima_22744
crossref_primary_10_1002_ima_70060
crossref_primary_10_1038_s41598_022_06730_6
crossref_primary_10_1007_s11427_019_1556_7
crossref_primary_10_1016_j_ejrad_2019_108716
crossref_primary_10_3389_fonc_2021_773299
crossref_primary_10_3233_JIFS_210393
crossref_primary_10_2174_1874061802006010001
crossref_primary_10_3389_fmed_2021_810995
ContentType Journal Article
Copyright 2018 International Society for Magnetic Resonance in Medicine
2018 International Society for Magnetic Resonance in Medicine.
2019 International Society for Magnetic Resonance in Medicine
Copyright_xml – notice: 2018 International Society for Magnetic Resonance in Medicine
– notice: 2018 International Society for Magnetic Resonance in Medicine.
– notice: 2019 International Society for Magnetic Resonance in Medicine
DOI 10.1002/jmri.26337
DatabaseName CrossRef
PubMed
Biotechnology Research Abstracts
Neurosciences Abstracts
Technology Research Database
Engineering Research Database
ProQuest Health & Medical Complete (Alumni)
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
DatabaseTitle CrossRef
PubMed
ProQuest Health & Medical Complete (Alumni)
Engineering Research Database
Biotechnology Research Abstracts
Technology Research Database
Neurosciences Abstracts
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
DatabaseTitleList ProQuest Health & Medical Complete (Alumni)
MEDLINE - Academic

PubMed
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
EISSN 1522-2586
EndPage 1156
ExternalDocumentID 30350434
10_1002_jmri_26337
JMRI26337
Genre article
Research Support, Non-U.S. Gov't
Journal Article
ISSN 1053-1807
1522-2586
IsPeerReviewed true
IsScholarly true
Issue 4
Keywords prostatic peripheral zone
fully automatic segmentation
the ROI of prostate
cascaded fully convolutional network
Language English
License 2018 International Society for Magnetic Resonance in Medicine.
LinkModel DirectLink
ORCID 0000-0003-0440-1357
PMID 30350434
PQID 2190830365
PQPubID 1006400
PageCount 8
PublicationCentury 2000
PublicationDate April 2019
2019-04-00
20190401
PublicationDateYYYYMMDD 2019-04-01
PublicationDate_xml – month: 04
  year: 2019
  text: April 2019
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: Nashville
PublicationSubtitle JMRI
PublicationTitle Journal of magnetic resonance imaging
PublicationTitleAlternate J Magn Reson Imaging
PublicationYear 2019
Publisher Wiley Subscription Services, Inc
Publisher_xml – name: Wiley Subscription Services, Inc
SourceID proquest
pubmed
crossref
wiley
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1149
SubjectTerms Algorithms
Cancer
cascaded fully convolutional network
Contours
Convolution
Field strength
fully automatic segmentation
Image processing
Image segmentation
Magnetic resonance imaging
Medical imaging
Population (statistical)
Prostate cancer
prostatic peripheral zone
Shape
Statistical analysis
Statistical tests
the ROI of prostate
Title Fully automatic segmentation on prostate MR images based on cascaded fully convolution network
URI https://onlinelibrary.wiley.com/doi/abs/10.1002%2Fjmri.26337
https://www.ncbi.nlm.nih.gov/pubmed/30350434
https://www.proquest.com/docview/2190830365
https://www.proquest.com/docview/2179221549
Volume 49