Deformable Image Registration Using a Cue-Aware Deep Regression Network

Bibliographic Details
Published in IEEE Transactions on Biomedical Engineering, Vol. 65, no. 9, pp. 1900-1911
Main Authors Cao, Xiaohuan; Yang, Jianhua; Zhang, Jun; Wang, Qian; Yap, Pew-Thian; Shen, Dinggang
Format Journal Article
Language English
Published United States IEEE 01.09.2018
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access Get full text
ISSN 0018-9294
EISSN 1558-2531
DOI 10.1109/TBME.2018.2822826

Abstract Significance: Analysis of modern large-scale, multicenter, or diseased data requires deformable registration algorithms that can cope with data of diverse nature. Objective: We propose a novel deformable registration method, based on a cue-aware deep regression network, to deal with multiple databases with minimal parameter tuning. Methods: Our method learns and predicts the deformation field between a reference image and a subject image. Specifically, given a set of training images, our method learns the displacement vector associated with a pair of reference-subject patches. To achieve this, we first introduce a key-point truncated-balanced sampling strategy to facilitate accurate learning from an image database of limited size. Then, we design a cue-aware deep regression network, in which the contextual cue, i.e., the scale-adaptive local similarity, is employed to more explicitly guide the learning process. The deep regression network is thus aware of the contextual cue and can accurately predict local deformation. Results and Conclusion: Our experiments show that the proposed method can tackle various registration tasks on different databases, giving consistently good performance without the need for manual parameter tuning, which makes it applicable to various clinical applications.
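The core idea summarized in the abstract, regressing a displacement vector from a reference-subject patch pair while feeding an explicit similarity cue as an extra input channel, can be illustrated with a short sketch. The code below is a hypothetical toy in PyTorch, not the authors' implementation: the network layout, the constant-per-patch correlation cue (standing in for the paper's scale-adaptive local similarity), and the names local_similarity_cue and CueAwareRegressor are all illustrative assumptions.

# Minimal, hypothetical sketch: patch-pair displacement regression
# with a precomputed similarity cue as a third input channel.
import torch
import torch.nn as nn


def local_similarity_cue(ref_patch: torch.Tensor, sub_patch: torch.Tensor,
                         eps: float = 1e-6) -> torch.Tensor:
    # Toy contextual cue: zero-mean normalized correlation over the whole patch,
    # broadcast to a constant map. The paper's scale-adaptive local similarity is
    # richer; this is only a placeholder.
    r = ref_patch - ref_patch.mean()
    s = sub_patch - sub_patch.mean()
    ncc = (r * s).sum() / (r.norm() * s.norm() + eps)
    return torch.full_like(ref_patch, ncc.item())


class CueAwareRegressor(nn.Module):
    # 3-D CNN mapping (reference, subject, cue) patches to one displacement vector.
    def __init__(self, patch_size: int = 16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
        )
        feat_dim = 32 * (patch_size // 4) ** 3
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(feat_dim, 128), nn.ReLU(inplace=True),
            nn.Linear(128, 3),  # (dx, dy, dz) at the patch centre
        )

    def forward(self, ref, sub, cue):
        # Each input: (N, D, H, W); stacked along channels -> (N, 3, D, H, W).
        x = torch.stack([ref, sub, cue], dim=1)
        return self.head(self.features(x))


if __name__ == "__main__":
    ps = 16
    ref = torch.rand(ps, ps, ps)
    sub = torch.rand(ps, ps, ps)
    cue = local_similarity_cue(ref, sub)
    model = CueAwareRegressor(patch_size=ps)
    disp = model(ref.unsqueeze(0), sub.unsqueeze(0), cue.unsqueeze(0))
    print(disp.shape)  # torch.Size([1, 3])

Supplying the similarity cue as its own channel, rather than leaving the network to rediscover intensity correspondence on its own, is the design choice the abstract describes as guiding the learning more explicitly; the paper's key-point truncated-balanced sampling would additionally select and balance the training patch pairs, which this sketch omits.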
Author Yang, Jianhua
Wang, Qian
Shen, Dinggang
Cao, Xiaohuan
Yap, Pew-Thian
Zhang, Jun
Author_xml – sequence: 1
  givenname: Xiaohuan
  orcidid: 0000-0002-2413-114X
  surname: Cao
  fullname: Cao, Xiaohuan
  organization: School of Automation, Northwestern Polytechnical University, and also with the Department of Radiology and BRIC, University of North Carolina at Chapel Hill
– sequence: 2
  givenname: Jianhua
  surname: Yang
  fullname: Yang, Jianhua
  organization: School of Automation, Northwestern Polytechnical University
– sequence: 3
  givenname: Jun
  orcidid: 0000-0001-5579-7094
  surname: Zhang
  fullname: Zhang, Jun
  organization: Department of Radiology and BRIC, University of North Carolina at Chapel Hill
– sequence: 4
  givenname: Qian
  surname: Wang
  fullname: Wang, Qian
  email: wang.qian@sjtu.edu.cn
  organization: School of Biomedical Engineering, Institute for Medical Imaging Technology, Shanghai Jiao Tong University, Shanghai, China
– sequence: 5
  givenname: Pew-Thian
  orcidid: 0000-0003-1489-2102
  surname: Yap
  fullname: Yap, Pew-Thian
  organization: Department of Radiology and BRIC, University of North Carolina at Chapel Hill
– sequence: 6
  givenname: Dinggang
  orcidid: 0000-0002-7934-5698
  surname: Shen
  fullname: Shen, Dinggang
  email: dgshen@med.unc.edu
  organization: Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
BackLink https://www.ncbi.nlm.nih.gov/pubmed/29993391 (View this record in MEDLINE/PubMed)
CODEN IEBEAX
CitedBy_id crossref_primary_10_1002_acm2_12968
crossref_primary_10_1088_1361_6560_ad67a6
crossref_primary_10_1016_j_neuroimage_2022_119444
crossref_primary_10_1016_j_bspc_2024_106172
crossref_primary_10_1109_TBME_2023_3280463
crossref_primary_10_1002_mp_15420
crossref_primary_10_1016_j_compmedimag_2022_102112
crossref_primary_10_1177_11795549241303606
crossref_primary_10_1016_j_neucom_2020_08_085
crossref_primary_10_1109_ACCESS_2022_3174360
crossref_primary_10_3390_brainsci15010046
crossref_primary_10_1109_JBHI_2020_3045977
crossref_primary_10_1109_TBME_2018_2885436
crossref_primary_10_1109_JBHI_2020_3016699
crossref_primary_10_1007_s11517_022_02725_7
crossref_primary_10_1016_j_compbiomed_2024_108948
crossref_primary_10_1016_j_media_2019_101545
crossref_primary_10_1117_1_JEI_33_5_053055
crossref_primary_10_1016_j_media_2021_102292
crossref_primary_10_1088_1361_6560_ad9e69
crossref_primary_10_3390_s23063208
crossref_primary_10_1016_j_compbiomed_2022_105780
crossref_primary_10_1109_TMI_2021_3059282
crossref_primary_10_1049_ipr2_13215
crossref_primary_10_3390_s21062200
crossref_primary_10_1007_s10489_022_04109_8
crossref_primary_10_1109_TNNLS_2022_3141119
crossref_primary_10_1109_TPAMI_2023_3243040
crossref_primary_10_1002_ima_23171
crossref_primary_10_1109_TMI_2019_2896170
crossref_primary_10_1016_j_compmedimag_2023_102322
crossref_primary_10_1016_j_media_2020_101638
crossref_primary_10_1002_mp_13994
crossref_primary_10_1016_j_inffus_2023_102061
crossref_primary_10_1016_j_neunet_2024_106426
crossref_primary_10_1016_j_neucom_2020_04_122
crossref_primary_10_1088_1361_6560_ad6952
crossref_primary_10_1088_2057_1976_ab446b
crossref_primary_10_1109_TRPMS_2021_3107454
crossref_primary_10_1016_j_media_2023_102740
crossref_primary_10_1007_s10334_019_00782_y
crossref_primary_10_1109_TPAMI_2018_2889096
crossref_primary_10_1007_s10334_023_01144_5
crossref_primary_10_1109_ACCESS_2020_3015504
crossref_primary_10_1016_j_media_2023_103034
crossref_primary_10_1002_mp_16291
crossref_primary_10_1016_j_neucom_2022_11_088
crossref_primary_10_1016_j_bspc_2019_101562
crossref_primary_10_1109_ACCESS_2020_3047829
crossref_primary_10_1007_s10489_021_03062_2
crossref_primary_10_1186_s12880_021_00636_x
crossref_primary_10_1088_1361_6560_ac0afc
crossref_primary_10_1007_s11063_023_11311_3
crossref_primary_10_3390_app10031171
crossref_primary_10_1016_j_eij_2024_100558
crossref_primary_10_1016_j_zemedi_2018_11_002
crossref_primary_10_1016_j_patcog_2019_107171
crossref_primary_10_1002_mrm_28688
crossref_primary_10_1016_j_ymeth_2020_09_007
crossref_primary_10_3389_fnins_2020_620235
crossref_primary_10_1016_j_media_2020_101817
crossref_primary_10_1002_mp_14464
crossref_primary_10_1007_s10489_021_02196_7
crossref_primary_10_1515_revneuro_2023_0115
crossref_primary_10_1002_mp_14935
crossref_primary_10_1186_s12893_023_01944_5
crossref_primary_10_1088_1361_6560_ab843e
crossref_primary_10_1007_s12021_020_09483_7
crossref_primary_10_1016_j_media_2023_102962
crossref_primary_10_1016_j_patrec_2021_08_032
crossref_primary_10_1016_j_media_2024_103351
crossref_primary_10_1109_TMI_2019_2953788
crossref_primary_10_1007_s10489_022_04329_y
Cites_doi 10.1109/TMI.2016.2521800
10.1109/TMI.2002.803111
10.1109/TIP.2011.2170698
10.1109/TMI.2013.2265603
10.1016/S1361-8415(01)00036-6
10.1001/jama.2016.17216
10.1109/42.796284
10.1002/hbm.22233
10.1142/S0218001497000597
10.1088/0031-9155/46/3/201
10.1016/j.neuroimage.2017.07.008
10.1109/42.906424
10.1016/j.media.2017.05.004
10.1038/nature21056
10.1016/j.media.2010.07.002
10.1016/j.jneumeth.2004.07.014
10.1016/j.media.2008.03.006
10.1016/j.media.2007.06.004
10.1016/S1361-8415(01)80004-9
10.1016/j.media.2016.05.004
10.1016/S0262-8856(03)00137-9
10.1097/00004728-199801000-00028
10.1007/978-3-642-15816-2_5
10.1109/42.563664
10.1023/B:VISI.0000043755.93987.aa
10.1109/TMI.2003.815867
10.1016/j.neuroimage.2011.09.015
10.1146/annurev-bioeng-071516-044442
10.1109/TMI.2007.904691
10.1016/j.neuroimage.2009.10.065
10.1016/j.neuroimage.2008.12.037
10.1016/S1361-8415(98)80022-4
10.1016/S0031-3203(98)00095-8
10.1109/TMI.2014.2330355
10.1016/j.neuroimage.2008.10.040
10.1016/S1361-8415(01)80026-8
10.1016/j.neuroimage.2014.06.077
10.1016/j.media.2016.06.030
10.1007/978-3-319-47157-0_5
10.1097/00004728-199801000-00027
10.1016/j.media.2014.10.007
10.1016/j.neuroimage.2013.04.114
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
DBID 97E
RIA
RIE
AAYXX
CITATION
CGR
CUY
CVF
ECM
EIF
NPM
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
5PM
DOI 10.1109/TBME.2018.2822826
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE/IET Electronic Library
CrossRef
Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Ceramic Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
PubMed Central (Full Participant titles)
DatabaseTitle CrossRef
MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
Materials Research Database
Civil Engineering Abstracts
Aluminium Industry Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Ceramic Abstracts
Materials Business File
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Aerospace Database
Engineered Materials Abstracts
Biotechnology Research Abstracts
Solid State and Superconductivity Abstracts
Engineering Research Database
Corrosion Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
MEDLINE - Academic
DatabaseTitleList Materials Research Database

MEDLINE - Academic
MEDLINE
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: EIF
  name: MEDLINE
  url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
– sequence: 3
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
Engineering
EISSN 1558-2531
EndPage 1911
ExternalDocumentID PMC6178830
29993391
10_1109_TBME_2018_2822826
8331111
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
Research Support, N.I.H., Extramural
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 61473190; 81471733; 61401271
– fundername: NIH
  grantid: CA206100; AG053867
– fundername: National Key Research and Development Program of China
  grantid: 2017YFC0107600
– fundername: Science and Technology Commission of Shanghai Municipality
  grantid: 16511101100; 16410722400
– fundername: NCI NIH HHS
  grantid: R01 CA206100
– fundername: NIA NIH HHS
  grantid: RF1 AG053867
GroupedDBID ---
-~X
.55
.DC
.GJ
0R~
29I
4.4
53G
5GY
5RE
5VS
6IF
6IK
6IL
6IN
85S
97E
AAJGR
AARMG
AASAJ
AAWTH
AAYJJ
ABAZT
ABJNI
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
ACKIV
ACNCT
ACPRK
ADZIZ
AENEX
AETIX
AFFNX
AFRAH
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ASUFR
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CHZPO
CS3
DU5
EBS
EJD
F5P
HZ~
H~9
IAAWW
IBMZZ
ICLAB
IDIHD
IEGSK
IFIPE
IFJZH
IPLJI
JAVBF
LAI
MS~
O9-
OCL
P2P
RIA
RIE
RIL
RNS
TAE
TN5
VH1
VJK
X7M
ZGI
ZXP
AAYXX
CITATION
RIG
CGR
CUY
CVF
ECM
EIF
NPM
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
5PM
IEDL.DBID RIE
ISSN 0018-9294
1558-2531
IngestDate Thu Aug 21 14:11:28 EDT 2025
Thu Jul 10 20:34:41 EDT 2025
Mon Jun 30 08:35:50 EDT 2025
Mon Jul 21 06:00:20 EDT 2025
Thu Apr 24 23:10:57 EDT 2025
Tue Jul 01 03:28:30 EDT 2025
Wed Aug 27 02:30:44 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 9
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
Personal use is permitted, but republication redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information.
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ORCID 0000-0001-5579-7094
0000-0002-2413-114X
0000-0002-7934-5698
0000-0003-1489-2102
PMID 29993391
PQID 2117131768
PQPubID 85474
PageCount 12
ParticipantIDs pubmedcentral_primary_oai_pubmedcentral_nih_gov_6178830
pubmed_primary_29993391
crossref_primary_10_1109_TBME_2018_2822826
proquest_journals_2117131768
crossref_citationtrail_10_1109_TBME_2018_2822826
ieee_primary_8331111
proquest_miscellaneous_2068342977
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2018-09-01
PublicationDateYYYYMMDD 2018-09-01
PublicationDate_xml – month: 09
  year: 2018
  text: 2018-09-01
  day: 01
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: New York
PublicationTitle IEEE transactions on biomedical engineering
PublicationTitleAbbrev TBME
PublicationTitleAlternate IEEE Trans Biomed Eng
PublicationYear 2018
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref56
ref12
ref15
ref14
ref55
ref10
ref17
ref19
ref18
de vos (ref53) 0
wu (ref50) 0
gutiérrez-becker (ref47) 0
ronneberger (ref5) 0
esteva (ref2) 2017; 542
ref45
ref48
ref42
ref41
ref44
ref43
ref9
ref4
ref3
ref6
wolterink (ref7) 0
cao (ref46) 0
ref35
rohé (ref51) 0
ref34
ref37
ref36
ref31
ref33
ref32
ref1
ref39
ref38
sokooti (ref11) 0
kim (ref8) 2012; 21
hellier (ref40) 0
hellier (ref25) 0
ref24
ref23
ref26
ref20
ref22
ref21
ref28
rueckert (ref16) 2010
ref27
ref29
pluim (ref30) 0
krebs (ref52) 0
ma (ref49) 0
li (ref54) 2017
References_xml – ident: ref48
  doi: 10.1109/TMI.2016.2521800
– ident: ref28
  doi: 10.1109/TMI.2002.803111
– volume: 21
  start-page: 1823
  year: 2012
  ident: ref8
  article-title: A general fast registration framework by learning deformation-Appearance correlation
  publication-title: IEEE Trans Image Process
  doi: 10.1109/TIP.2011.2170698
– ident: ref13
  doi: 10.1109/TMI.2013.2265603
– start-page: 266
  year: 0
  ident: ref51
  article-title: SVF-Net: Learning deformable image registration using shape matching
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– ident: ref55
  doi: 10.1016/S1361-8415(01)00036-6
– ident: ref3
  doi: 10.1001/jama.2016.17216
– ident: ref35
  doi: 10.1109/42.796284
– ident: ref29
  doi: 10.1002/hbm.22233
– ident: ref27
  doi: 10.1142/S0218001497000597
– start-page: 649
  year: 0
  ident: ref50
  article-title: Unsupervised deep feature learning for deformable registration of MR brain images
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– start-page: 590
  year: 0
  ident: ref40
  article-title: Intersubject registration of functional and anatomical data using SPM
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– start-page: 344
  year: 0
  ident: ref52
  article-title: Robust nonrigid registration through agent-based action learning
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– ident: ref15
  doi: 10.1088/0031-9155/46/3/201
– start-page: 19
  year: 0
  ident: ref47
  article-title: Learning optimization updates for multimodal registration
  publication-title: Proc Med Image Comput Comput -Assisted Intervention
– ident: ref10
  doi: 10.1016/j.neuroimage.2017.07.008
– start-page: 234
  year: 0
  ident: ref5
  article-title: U-net: Convolutional networks for biomedical image segmentation
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– start-page: 452
  year: 0
  ident: ref30
  article-title: Image registration by maximization of combined mutual information and gradient information
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– ident: ref56
  doi: 10.1109/42.906424
– ident: ref45
  doi: 10.1016/j.media.2017.05.004
– volume: 542
  start-page: 115
  year: 2017
  ident: ref2
  article-title: Dermatologist-level classification of skin cancer with deep neural networks
  publication-title: Nature
  doi: 10.1038/nature21056
– ident: ref36
  doi: 10.1016/j.media.2010.07.002
– ident: ref26
  doi: 10.1016/j.jneumeth.2004.07.014
– ident: ref41
  doi: 10.1016/j.media.2008.03.006
– ident: ref34
  doi: 10.1016/j.media.2007.06.004
– ident: ref32
  doi: 10.1016/S1361-8415(01)80004-9
– ident: ref6
  doi: 10.1016/j.media.2016.05.004
– ident: ref17
  doi: 10.1016/S0262-8856(03)00137-9
– ident: ref24
  doi: 10.1097/00004728-199801000-00028
– start-page: 131
  year: 2010
  ident: ref16
  article-title: Medical image registration
  publication-title: Biomedical Image Processing
  doi: 10.1007/978-3-642-15816-2_5
– ident: ref33
  doi: 10.1109/42.563664
– ident: ref37
  doi: 10.1023/B:VISI.0000043755.93987.aa
– start-page: 204
  year: 0
  ident: ref53
  article-title: End-to-end unsupervised deformable image registration with a convolutional neural network
  publication-title: Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support
– ident: ref31
  doi: 10.1109/TMI.2003.815867
– ident: ref22
  doi: 10.1016/j.neuroimage.2011.09.015
– ident: ref1
  doi: 10.1146/annurev-bioeng-071516-044442
– ident: ref18
  doi: 10.1109/TMI.2007.904691
– ident: ref38
  doi: 10.1016/j.neuroimage.2009.10.065
– start-page: 240
  year: 0
  ident: ref49
  article-title: Multimodal image registration with deep context reinforcement learning
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– ident: ref43
  doi: 10.1016/j.neuroimage.2008.12.037
– year: 2017
  ident: ref54
  article-title: Nonrigid image registration using fully convolutional networks with deep self-supervision
– ident: ref20
  doi: 10.1016/S1361-8415(98)80022-4
– ident: ref14
  doi: 10.1016/S0031-3203(98)00095-8
– ident: ref42
  doi: 10.1109/TMI.2014.2330355
– start-page: 1
  year: 0
  ident: ref46
  article-title: Learning-based multimodal image registration for prostate cancer radiation therapy
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
– ident: ref21
  doi: 10.1016/j.neuroimage.2008.10.040
– ident: ref12
  doi: 10.1016/S1361-8415(01)80026-8
– ident: ref4
  doi: 10.1016/j.neuroimage.2014.06.077
– ident: ref19
  doi: 10.1016/j.media.2016.06.030
– ident: ref44
  doi: 10.1007/978-3-319-47157-0_5
– ident: ref23
  doi: 10.1097/00004728-199801000-00027
– start-page: 590
  year: 0
  ident: ref25
  article-title: Inter-subject registration of functional and anatomical data using SPM
  publication-title: Proc Med Image Comput Comput -Assisted Intervention
– ident: ref9
  doi: 10.1016/j.media.2014.10.007
– start-page: 14
  year: 0
  ident: ref7
  article-title: Deep MR to CT synthesis using unpaired data
– ident: ref39
  doi: 10.1016/j.neuroimage.2013.04.114
– start-page: 232
  year: 0
  ident: ref11
  article-title: Nonrigid image registration using multiscale 3-D convolutional neural networks
  publication-title: Proc Int Conf Med Image Comput Comput -Assisted Intervention
SSID ssj0014846
Score 2.5937665
SourceID pubmedcentral
proquest
pubmed
crossref
ieee
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1900
SubjectTerms Algorithms
Brain - diagnostic imaging
Databases, Factual
Deep Learning
Deformable registration
Deformation
Formability
Humans
Image Processing, Computer-Assisted - methods
Image registration
Imaging, Three-Dimensional
key-points sampling
Machine learning
Nonlinear Dynamics
nonlinear regression
Parameters
Registration
Regression analysis
Strain
Task analysis
Therapeutic applications
Tomography, X-Ray Computed
Training
Tuning
Title Deformable Image Registration Using a Cue-Aware Deep Regression Network
URI https://ieeexplore.ieee.org/document/8331111
https://www.ncbi.nlm.nih.gov/pubmed/29993391
https://www.proquest.com/docview/2117131768
https://www.proquest.com/docview/2068342977
https://pubmed.ncbi.nlm.nih.gov/PMC6178830
Volume 65
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Deformable+Image+Registration+Using+Cue-aware+Deep+Regression+Network&rft.jtitle=IEEE+transactions+on+biomedical+engineering&rft.au=Cao%2C+Xiaohuan&rft.au=Yang%2C+Jianhua&rft.au=Zhang%2C+Jun&rft.au=Wang%2C+Qian&rft.date=2018-09-01&rft.issn=0018-9294&rft.eissn=1558-2531&rft.volume=65&rft.issue=9&rft.spage=1900&rft.epage=1911&rft_id=info:doi/10.1109%2FTBME.2018.2822826&rft_id=info%3Apmid%2F29993391&rft.externalDocID=PMC6178830