Contrastive Learning for Prediction of Alzheimer's Disease Using Brain 18F-FDG PET

Bibliographic Details
Published in IEEE Journal of Biomedical and Health Informatics, Vol. 27, no. 4, pp. 1735-1746
Main Authors Chen, Yonglin, Wang, Huabin, Zhang, Gong, Liu, Xiao, Huang, Wei, Han, Xianjun, Li, Xuejun, Martin, Melanie, Tao, Liang
Format Journal Article
Language English
Published United States IEEE 01.04.2023
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access Get full text

Abstract Brain 18F-FDG PET images are widely used for effectively predicting Alzheimer's disease (AD). However, the volume of available PET data is usually insufficient, which makes it difficult to train an accurate AD prediction network. Furthermore, PET images are noisy, with a low signal-to-noise ratio, and the feature used for predicting AD (metabolic abnormality) is not always obvious in a PET image. Therefore, a contrastive learning based method is proposed to address these challenges inherent in PET images. Firstly, the slices of the 3D PET image are amplified by cropping anchor images (i.e., augmented versions of the same image) to generate extended training data. Meanwhile, a contrastive loss is adopted to enlarge inter-class feature distances and reduce intra-class feature differences, using subject fuzzy labels as supervised information. Secondly, we construct a double convolutional hybrid attention module, in which two convolutional layers with different kernels ($7\times 7$ and $5\times 5$) allow the network to learn different perceptual domains. Moreover, we recommend a diagnosis mechanism that analyzes the consistency of the predicted results for PET slices along with the clinical neuropsychological assessment to achieve a better AD diagnosis. The experimental results show that the proposed method outperforms the state of the art on brain 18F-FDG PET images, demonstrating its advantage in effectively predicting AD.
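The contrastive objective described in the abstract can be illustrated with a minimal supervised contrastive loss sketch in PyTorch. This is an illustrative assumption, not the paper's implementation: it uses hard subject labels instead of the fuzzy labels mentioned above, and the function name and temperature value are placeholders.

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.07):
    """embeddings: (N, D), L2-normalized slice features; labels: (N,) class indices."""
    logits = embeddings @ embeddings.t() / temperature       # pairwise similarities
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=logits.device)
    # Exclude self-comparisons from the denominator of the softmax.
    exp_logits = torch.exp(logits).masked_fill(self_mask, 0.0)
    log_prob = logits - torch.log(exp_logits.sum(dim=1, keepdim=True))
    # Positives: other samples sharing the same label (pulled together);
    # everything else is pushed apart, enlarging inter-class distances.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count
    return loss.mean()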
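The double convolutional hybrid attention module is described only at the kernel level (7x7 and 5x5). The sketch below is a CBAM-style spatial-attention reading with two parallel branches; the fusion by summation and all layer names are assumptions rather than the paper's exact design.

import torch
import torch.nn as nn

class DualKernelSpatialAttention(nn.Module):
    def __init__(self):
        super().__init__()
        # Each branch sees [avg-pool, max-pool] maps stacked along the channel axis.
        self.conv7 = nn.Conv2d(2, 1, kernel_size=7, padding=3, bias=False)
        self.conv5 = nn.Conv2d(2, 1, kernel_size=5, padding=2, bias=False)

    def forward(self, x):                      # x: (B, C, H, W)
        avg_map = x.mean(dim=1, keepdim=True)  # channel-wise average
        max_map = x.amax(dim=1, keepdim=True)  # channel-wise max
        pooled = torch.cat([avg_map, max_map], dim=1)
        # Two receptive fields capture different perceptual domains; fuse by sum.
        attn = torch.sigmoid(self.conv7(pooled) + self.conv5(pooled))
        return x * attn                        # reweight spatial locations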
Author Tao, Liang
Martin, Melanie
Huang, Wei
Li, Xuejun
Zhang, Gong
Chen, Yonglin
Liu, Xiao
Han, Xianjun
Wang, Huabin
Author_xml – sequence: 1
  givenname: Yonglin
  orcidid: 0000-0002-3243-663X
  surname: Chen
  fullname: Chen, Yonglin
  email: e20101001@stu.ahu.edu.cn
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
– sequence: 2
  givenname: Huabin
  orcidid: 0000-0001-5938-5409
  surname: Wang
  fullname: Wang, Huabin
  email: wanghuabin@ahu.edu.cn
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
– sequence: 3
  givenname: Gong
  surname: Zhang
  fullname: Zhang, Gong
  email: umzhan00@gmail.com
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
– sequence: 4
  givenname: Xiao
  orcidid: 0000-0001-8400-5754
  surname: Liu
  fullname: Liu, Xiao
  email: xiao.liu@deakin.edu.au
  organization: School of Information Technology, Deakin University, Geelong, VIC, Australia
– sequence: 5
  givenname: Wei
  surname: Huang
  fullname: Huang, Wei
  email: 09057@ahu.edu.cn
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
– sequence: 6
  givenname: Xianjun
  orcidid: 0000-0001-7674-1428
  surname: Han
  fullname: Han, Xianjun
  email: hxj@ahu.edu.cn
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
– sequence: 7
  givenname: Xuejun
  orcidid: 0000-0001-6630-2958
  surname: Li
  fullname: Li, Xuejun
  email: xjli@ahu.edu.cn
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
– sequence: 8
  givenname: Melanie
  surname: Martin
  fullname: Martin, Melanie
  email: m.martin@uwinnipeg.ca
  organization: University of Winnipeg, Winnipeg, MB, Canada
– sequence: 9
  givenname: Liang
  orcidid: 0000-0001-8553-8421
  surname: Tao
  fullname: Tao, Liang
  email: taoliang@ahu.edu.cn
  organization: Anhui Provincial International Joint Research Center for Advanced Technology in Medical Imaging, Anhui University, Hefei, China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/37015664$$D View this record in MEDLINE/PubMed
BookMark eNp9kU1LAzEQhoMofv8AESTgQS9bk-xXctRqq1JQRM8hTWY1sk002Qr6683S1kMPzmWG8Lwz4X330KbzDhA6omRAKREX91e3dwNGGBvkLKeClBtol9GKZ4wRvrmaqSh20GGM7yQVT0-i2kY7eU1oWVXFLnoaetcFFTv7BXgCKjjrXnHjA34MYKzurHfYN_iy_XkDO4NwFvG1jaAi4JfYs1dBWYcpH2Wj6zF-vHk-QFuNaiMcLvs-ehndPA9vs8nD-G54Ocl0Xoguq6Hg1DSl0qZghoE2oGvNcihJw4loTE5KAozSWnPDldZTURW8BKOgrmHa5PvofLH3I_jPOcROzmzU0LbKgZ9HyWpR0Yrwskjo6Rr67ufBpd_1VMlJfytRJ0tqPp2BkR_BzlT4liu3ElAvAB18jAEaqW2neouShbaVlMg-GtlHI_to5DKapKRrytXy_zTHC40FgD9epCKU5b9impal
CODEN IJBHA9
CitedBy_id crossref_primary_10_1109_ACCESS_2024_3418508
crossref_primary_10_1109_JTEHM_2023_3344035
crossref_primary_10_3389_fnins_2023_1272834
crossref_primary_10_1016_j_neulet_2023_137530
crossref_primary_10_1109_TFUZZ_2024_3409412
crossref_primary_10_1007_s10462_024_11041_5
crossref_primary_10_1016_j_eswa_2024_124780
crossref_primary_10_1088_1361_6560_acfec8
crossref_primary_10_1109_TNSRE_2025_3549730
crossref_primary_10_3390_bioengineering11030219
Cites_doi 10.3389/fnagi.2021.764872
10.1109/JBHI.2021.3113668
10.1016/j.biopha.2017.12.053
10.1007/s12021-018-9370-4
10.1109/CVPR.2019.00020
10.1007/978-3-030-58621-8_45
10.1007/s00429-013-0687-3
10.1016/j.neucom.2019.04.093
10.1145/3240508.3240550
10.1177/1533317520918719
10.1016/j.neurobiolaging.2017.09.007
10.1016/j.neucom.2018.11.111
10.1007/978-3-319-10590-1_53
10.3389/fdgth.2021.637386
10.1002/nbm.3329
10.1109/CVPR.2018.00393
10.1109/TMI.2020.3022591
10.3390/app11052187
10.1007/s00259-021-05556-0
10.1109/JBHI.2021.3097721
10.3389/fninf.2018.00035
10.1148/radiol.2018180958
10.1016/j.neuroimage.2011.01.008
10.1111/ene.13728
10.1109/TBME.2014.2372011
10.1109/MMSP.2015.7340796
10.48550/ARXIV.1807.06521
10.1186/s12938-020-00813-z
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023
DBID 97E
RIA
RIE
AAYXX
CITATION
NPM
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
K9.
KR7
L7M
L~C
L~D
NAPCQ
P64
7X8
DOI 10.1109/JBHI.2022.3231905
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005-present
IEEE All-Society Periodicals Package (ASPP) 1998-Present
IEEE Electronic Library (IEL)
CrossRef
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Ceramic Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
ProQuest Health & Medical Complete (Alumni)
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Nursing & Allied Health Premium
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
DatabaseTitle CrossRef
PubMed
Materials Research Database
Civil Engineering Abstracts
Aluminium Industry Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
ProQuest Health & Medical Complete (Alumni)
Ceramic Abstracts
Materials Business File
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Aerospace Database
Nursing & Allied Health Premium
Engineered Materials Abstracts
Biotechnology Research Abstracts
Solid State and Superconductivity Abstracts
Engineering Research Database
Corrosion Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
MEDLINE - Academic
DatabaseTitleList MEDLINE - Academic
Materials Research Database

PubMed
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
EISSN 2168-2208
EndPage 1746
ExternalDocumentID 37015664
10_1109_JBHI_2022_3231905
9999012
Genre orig-research
Journal Article
GrantInformation_xml – fundername: Natural Science Foundation for the Higher Education Institutions of Anhui Province
  grantid: 2022AH050091
– fundername: Natural Science Foundation of Anhui Province
  grantid: 1908085MF209
  funderid: 10.13039/501100003995
– fundername: National Natural Science Foundation of China
  grantid: 61972001; 62106005
  funderid: 10.13039/501100001809
GroupedDBID 0R~
4.4
6IF
6IH
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACIWK
ACPRK
AENEX
AFRAH
AGQYO
AGSQL
AHBIQ
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
HZ~
IFIPE
IPLJI
JAVBF
M43
O9-
OCL
PQQKQ
RIA
RIE
RNS
AAYXX
CITATION
RIG
NPM
7QF
7QO
7QQ
7SC
7SE
7SP
7SR
7TA
7TB
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
K9.
KR7
L7M
L~C
L~D
NAPCQ
P64
7X8
ID FETCH-LOGICAL-c349t-7e481df5acd42d2ecdec7c23e50f809fd3050e2117c8d8accb96485edae77ebf3
IEDL.DBID RIE
ISSN 2168-2194
2168-2208
IngestDate Fri Jul 11 12:20:57 EDT 2025
Sun Jun 29 13:23:00 EDT 2025
Sun Apr 06 01:21:17 EDT 2025
Tue Jul 01 03:00:04 EDT 2025
Thu Apr 24 23:09:40 EDT 2025
Wed Aug 27 02:06:09 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c349t-7e481df5acd42d2ecdec7c23e50f809fd3050e2117c8d8accb96485edae77ebf3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ORCID 0000-0001-8400-5754
0000-0001-6630-2958
0000-0001-5938-5409
0000-0002-3243-663X
0000-0001-8553-8421
0000-0001-7674-1428
PMID 37015664
PQID 2795803050
PQPubID 85417
PageCount 12
ParticipantIDs crossref_citationtrail_10_1109_JBHI_2022_3231905
crossref_primary_10_1109_JBHI_2022_3231905
proquest_journals_2795803050
pubmed_primary_37015664
proquest_miscellaneous_2796160854
ieee_primary_9999012
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2023-04-01
PublicationDateYYYYMMDD 2023-04-01
PublicationDate_xml – month: 04
  year: 2023
  text: 2023-04-01
  day: 01
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: Piscataway
PublicationTitle IEEE journal of biomedical and health informatics
PublicationTitleAbbrev JBHI
PublicationTitleAlternate IEEE J Biomed Health Inform
PublicationYear 2023
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References Grill (ref12) 2020; 33
ref34
ref15
ref14
ref31
ref30
ref10
ref32
ref2
ref1
ref17
ref16
ref19
ref18
Müller (ref24) 2019; 32
Khosla (ref13) 2020; 33
ref23
ref26
ref20
ref22
ref21
ref28
ref27
Chen (ref25) 2022; 6
ref29
ref8
ref7
Maaten (ref33) 2008; 9
ref9
ref4
Kurakin (ref11) 2020
ref3
ref6
ref5
References_xml – ident: ref21
  doi: 10.3389/fnagi.2021.764872
– ident: ref23
  doi: 10.1109/JBHI.2021.3113668
– ident: ref1
  doi: 10.1016/j.biopha.2017.12.053
– ident: ref7
  doi: 10.1007/s12021-018-9370-4
– volume: 32
  start-page: 318
  volume-title: Proc. Adv. Neural Inf. Process. Syst.
  year: 2019
  ident: ref24
  article-title: When does label smoothing help
– ident: ref9
  doi: 10.1109/CVPR.2019.00020
– volume: 33
  start-page: 18661
  volume-title: Proc. Adv. Neural Inf. Process. Syst.
  year: 2020
  ident: ref13
  article-title: Supervised contrastive learning
– ident: ref14
  doi: 10.1007/978-3-030-58621-8_45
– ident: ref17
  doi: 10.1007/s00429-013-0687-3
– year: 2020
  ident: ref11
  article-title: Remixmatch: Semi-supervised learning with distribution alignment and augmentation anchoring
– ident: ref30
  doi: 10.1016/j.neucom.2019.04.093
– ident: ref16
  doi: 10.1145/3240508.3240550
– ident: ref3
  doi: 10.1177/1533317520918719
– ident: ref2
  doi: 10.1016/j.neurobiolaging.2017.09.007
– ident: ref4
  doi: 10.1016/j.neucom.2018.11.111
– ident: ref34
  doi: 10.1007/978-3-319-10590-1_53
– ident: ref26
  doi: 10.3389/fdgth.2021.637386
– ident: ref18
  doi: 10.1002/nbm.3329
– ident: ref10
  doi: 10.1109/CVPR.2018.00393
– ident: ref28
  doi: 10.1109/TMI.2020.3022591
– ident: ref8
  doi: 10.3390/app11052187
– volume: 9
  start-page: 2579
  issue: 11
  year: 2008
  ident: ref33
  article-title: Visualizing data using t-SNE
  publication-title: J. Mach. Learn. Res.
– ident: ref20
  doi: 10.1007/s00259-021-05556-0
– volume: 6
  issue: 2
  volume-title: Adv. Sens., Mater. Intell. Algorithms Multi-Domain Struct. Health Monit.
  year: 2022
  ident: ref25
  article-title: Quantitative monitoring of bolt looseness using multichannel piezoelectric active sensing and CBAM-based convolutional neural network
– ident: ref27
  doi: 10.1109/JBHI.2021.3097721
– ident: ref31
  doi: 10.3389/fninf.2018.00035
– ident: ref32
  doi: 10.1148/radiol.2018180958
– ident: ref19
  doi: 10.1016/j.neuroimage.2011.01.008
– volume: 33
  start-page: 21271
  volume-title: Proc. Adv. Neural Inf. Process. Syst.
  year: 2020
  ident: ref12
  article-title: Bootstrap your own latent-a new approach to self-supervised learning
– ident: ref6
  doi: 10.1111/ene.13728
– ident: ref22
  doi: 10.1109/TBME.2014.2372011
– ident: ref5
  doi: 10.1109/MMSP.2015.7340796
– ident: ref15
  doi: 10.48550/ARXIV.1807.06521
– ident: ref29
  doi: 10.1186/s12938-020-00813-z
SSID ssj0000816896
Score 2.465425
Snippet Brain 18F-FDG PET images are commonly-known materials for effectively predicting Alzheimer's disease (AD). However, the data volume of PET is usually...
SourceID proquest
pubmed
crossref
ieee
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1735
SubjectTerms Alzheimer's disease
Alzheimer's disease (AD)
Bioinformatics
Brain
brain 18F-FDG PET
Brain slice preparation
contrastive loss
Convolution
Correlation analysis
Deep learning
Diagnosis
Learning
Lesions
Medical imaging
multi-correlation analysis
multiattention mechanism
Neurodegenerative diseases
Neuroimaging
Noise prediction
Positron emission
Positron emission tomography
Signal to noise ratio
Training
Title Contrastive Learning for Prediction of Alzheimer's Disease Using Brain 18F-FDG PET
URI https://ieeexplore.ieee.org/document/9999012
https://www.ncbi.nlm.nih.gov/pubmed/37015664
https://www.proquest.com/docview/2795803050
https://www.proquest.com/docview/2796160854
Volume 27
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
link http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1Lb9QwEB6VHhAXoLRAoCAjIVWq8NabOHF87GtZKm1VoVbqLfJjUhBlF-3j0l_P2PFGoiqIW6T4kXjG488ez3wAHz3prLKl4EYpzaX0imuPlmtaCmxBALeNdD6T82p8Jc-uy-sN-NTHwiBivHyGg_AYffl-5lbhqOyAwAwtX2RwH9HGrYvV6s9TIoFEpOPK6YHTRJTJiTkU-uDsaPyFNoN5PigI0GgRCGsKFcKIK_nHihQpVv6ONuOqM3oGk_X3dpdNfgxWSztwd_dSOf7vDz2Hpwl-ssNOX7ZgA6cv4PEkOdi34WvIVjU3i2AEWcq9esMI2LKLeSgUxMhmLTu8vfuG33_ifG_BTjofD4u3D9hR4Jxgw3rERyef2cXp5Q5cjU4vj8c80S5wV0i95Aolgdi2NM7L3OfoPDrl8gJL0dZCt55MhEDaOCpX-9o4Z3Ul6xK9QaXQtsVL2JzOpvgamDPKFEOnpA0x917W1ANSC0basqCaGYj10Dcu5SQP1Bi3TdybCN0EwTVBcE0SXAb7fZVfXUKOfxXeDoPeF0zjncHuWr5NmrKLJle6rIP5Exl86F_TZAseFDPF2SqWqYYVoVSZwatOL_q21-r05uE-38KTwFTfXfrZhc3lfIXvCM8s7fuoyL8BehnsUA
linkProvider IEEE
linkToHtml http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1Lb9QwEB5VRQIuvAo0tICRkJAQ3jqJE8fHlnbZlm5Voa3UW-TYE4oou2gfl_76jh1vJBAgbpHiR-IZez57xvMBvHWks6opBDdKaS6lU1w7bLgmU9DkBHDbQOczPitHF_LksrjcgA_9XRhEDMFnOPCPwZfvZnblj8r2CMyQ-aIF9w7Z_SLtbmv1JyqBQiIQcmX0wGkqyujGTIXeOzkYHdN2MMsGOUEaLTxlTa78ReJS_mKTAsnK3_FmsDvDhzBef3EXbvJ9sFo2A3vzWzLH__2lR_AgAlC232nMY9jA6RO4O44u9i344vNVzc3CL4MsZl_9ygjasvO5L-QFyWYt27--ucJvP3D-bsEOOy8PC_EH7MCzTrC0GvLh4Sd2fjR5ChfDo8nHEY_EC9zmUi-5Qkkwti2MdTJzGVqHVtksx0K0ldCto0VCIG0dla1cZaxtdCmrAp1BpbBp82ewOZ1NcRuYNcrkqVWy8bfunayoB6QWjGyKnGomINZDX9uYldyTY1zXYXcidO0FV3vB1VFwCbzvq_zsUnL8q_CWH_S-YBzvBHbX8q3jpF3UmdJF5RdAkcCb_jVNN-9DMVOcrUKZMi0Jp8oEnnd60be9VqcXf-7zNdwbTcan9enx2ecduO9567sQoF3YXM5X-JLQzbJ5FZT6Fvew75k
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Contrastive+Learning+for+Prediction+of+Alzheimer%27s+Disease+Using+Brain+18F-FDG+PET&rft.jtitle=IEEE+journal+of+biomedical+and+health+informatics&rft.au=Chen%2C+Yonglin&rft.au=Wang%2C+Huabin&rft.au=Zhang%2C+Gong&rft.au=Liu%2C+Xiao&rft.date=2023-04-01&rft.pub=The+Institute+of+Electrical+and+Electronics+Engineers%2C+Inc.+%28IEEE%29&rft.issn=2168-2194&rft.eissn=2168-2208&rft.volume=27&rft.issue=4&rft.spage=1735&rft_id=info:doi/10.1109%2FJBHI.2022.3231905&rft.externalDBID=NO_FULL_TEXT
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=2168-2194&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=2168-2194&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=2168-2194&client=summon