Training recurrent neural networks robust to incomplete data: Application to Alzheimer’s disease progression modeling
Published in: Medical Image Analysis, Vol. 53, pp. 39–46
Main Authors: Mostafa Mehdipour Ghazi, Mads Nielsen, Akshay Pai, M. Jorge Cardoso, Marc Modat, Sébastien Ourselin, Lauge Sørensen
Format: Journal Article
Language: English
Published: Elsevier B.V., Netherlands, 1 April 2019
Highlights
• We propose a generalized algorithm to train LSTM networks robust to incomplete data.
• We introduce an end-to-end approach for biomarker modeling and clinical status prediction.
• It is applied to model Alzheimer's disease progression using volumetric MRI biomarkers.
• Our proposed algorithm predicts biomarker measurements with the lowest MAE.
• This is the first time RNNs are applied for neurodegenerative disease progression modeling.

Abstract
Disease progression modeling (DPM) using longitudinal data is a challenging machine learning task. Existing DPM algorithms neglect temporal dependencies among measurements, make parametric assumptions about biomarker trajectories, do not model multiple biomarkers jointly, and need an alignment of subjects' trajectories. In this paper, recurrent neural networks (RNNs) are utilized to address these issues. However, in many cases, longitudinal cohorts contain incomplete data, which hinders the application of standard RNNs and requires a pre-processing step such as imputation of the missing values. Instead, we propose a generalized training rule for the most widely used RNN architecture, long short-term memory (LSTM) networks, that can handle both missing predictor and target values. The proposed LSTM algorithm is applied to model the progression of Alzheimer's disease (AD) using six volumetric magnetic resonance imaging (MRI) biomarkers, i.e., volumes of the ventricles, hippocampus, whole brain, fusiform gyrus, middle temporal gyrus, and entorhinal cortex, and it is compared to standard LSTM networks with data imputation and to a parametric, regression-based DPM method. The results show that the proposed algorithm achieves a significantly lower mean absolute error (MAE) than the alternatives (p < 0.05, Wilcoxon signed-rank test) in predicting values of almost all of the MRI biomarkers. Moreover, a linear discriminant analysis (LDA) classifier applied to the predicted biomarker values produces a significantly larger area under the receiver operating characteristic curve (AUC) of 0.90 vs. at most 0.84 (p < 0.001, McNemar's test) for clinical diagnosis of AD. Inspection of MAE curves as a function of the amount of missing data reveals that the proposed LSTM algorithm remains the best-performing method until more than 74% of the values are missing. Finally, it is illustrated how the method can successfully be applied to data with varying time intervals. This paper shows that built-in handling of missing values in training an LSTM network benefits the application of RNNs for neurodegenerative disease progression modeling in longitudinal cohorts.
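The core idea described in the abstract, training an LSTM on visit sequences in which some target values are unobserved, can be illustrated with a masked loss: time points whose targets are missing simply contribute nothing to the gradient. The following PyTorch sketch is a minimal illustration under assumed names and tensor shapes (MaskedLSTMRegressor, masked_mae_loss, six biomarkers per visit); it is not the authors' implementation, which additionally handles missing predictor values inside the LSTM update rather than by simple pre-filling.

```python
import torch
import torch.nn as nn

class MaskedLSTMRegressor(nn.Module):
    """Hypothetical one-step-ahead LSTM regressor for multivariate biomarker sequences."""
    def __init__(self, n_biomarkers: int = 6, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_biomarkers, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_biomarkers)

    def forward(self, x):
        # x: (batch, visits, n_biomarkers); missing inputs are assumed pre-filled in this sketch.
        h, _ = self.lstm(x)
        return self.head(h)  # prediction for every visit

def masked_mae_loss(pred, target, observed):
    """Mean absolute error over observed target entries only (observed: 1 = present, 0 = missing)."""
    err = (pred - target).abs() * observed
    return err.sum() / observed.sum().clamp(min=1)

# Toy usage: 4 subjects, 5 visits, 6 MRI volume biomarkers, roughly 30% of targets missing.
x = torch.randn(4, 5, 6)
y = torch.randn(4, 5, 6)
observed = (torch.rand(4, 5, 6) > 0.3).float()

model = MaskedLSTMRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

optimizer.zero_grad()
loss = masked_mae_loss(model(x), y, observed)
loss.backward()
optimizer.step()
```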
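The diagnostic evaluation reported in the abstract, an LDA classifier applied to predicted biomarker values and scored by the area under the ROC curve, can be outlined with scikit-learn as below. The data are synthetic and the variable names are assumptions; the sketch only illustrates the kind of classification step described, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical data: predicted values of six MRI biomarkers per subject and a binary AD label.
rng = np.random.default_rng(0)
predicted_biomarkers = rng.normal(size=(200, 6))
diagnosis = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    predicted_biomarkers, diagnosis, test_size=0.3, random_state=0)

# LDA classifier on the predicted biomarker values; AUC from its posterior probabilities.
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
auc = roc_auc_score(y_test, lda.predict_proba(X_test)[:, 1])
print(f"AUC = {auc:.2f}")
```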
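The significance tests named in the abstract, a Wilcoxon signed-rank test on paired prediction errors and McNemar's test on paired classification outcomes, can be run in outline with SciPy and statsmodels. The numbers below are synthetic placeholders; only the test calls correspond to the methods mentioned.

```python
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import mcnemar

rng = np.random.default_rng(1)

# Paired per-subject MAEs of two models on the same test subjects (synthetic values).
mae_proposed = rng.gamma(shape=2.0, scale=0.05, size=100)
mae_baseline = mae_proposed + rng.normal(0.01, 0.02, size=100)
print("Wilcoxon signed-rank p-value:", wilcoxon(mae_proposed, mae_baseline).pvalue)

# McNemar's test on paired correct/incorrect decisions of two classifiers (synthetic values).
correct_a = rng.integers(0, 2, size=100).astype(bool)
correct_b = rng.integers(0, 2, size=100).astype(bool)
table = [[np.sum(correct_a & correct_b), np.sum(correct_a & ~correct_b)],
         [np.sum(~correct_a & correct_b), np.sum(~correct_a & ~correct_b)]]
print("McNemar p-value:", mcnemar(table, exact=True).pvalue)
```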
Authors and affiliations:
1. Mostafa Mehdipour Ghazi (mehdipour@biomediq.com), Biomediq A/S, Copenhagen, Denmark
2. Mads Nielsen, Biomediq A/S, Copenhagen, Denmark
3. Akshay Pai, Biomediq A/S, Copenhagen, Denmark
4. M. Jorge Cardoso (ORCID 0000-0003-1284-2558), Centre for Medical Image Computing, University College London, London, UK
5. Marc Modat (ORCID 0000-0002-5277-8530), Centre for Medical Image Computing, University College London, London, UK
6. Sébastien Ourselin, Centre for Medical Image Computing, University College London, London, UK
7. Lauge Sørensen (ORCID 0000-0002-1181-7150), Biomediq A/S, Copenhagen, Denmark
BackLink: https://www.ncbi.nlm.nih.gov/pubmed/30682584 (view this record in MEDLINE/PubMed)
Copyright: © 2019 Elsevier B.V. All rights reserved.
Corporate Author: Alzheimer's Disease Neuroimaging Initiative
DOI: 10.1016/j.media.2019.01.004
Discipline: Medicine; Engineering
EISSN: 1361-8423
Genre: Journal Article; Research Support, N.I.H., Extramural; Research Support, U.S. Gov't, Non-P.H.S.; Research Support, Non-U.S. Gov't
Grant Information: CIHR; NIA NIH HHS (grant U01 AG024904)
ISSN: 1361-8415
Keywords: Long short-term memory; Disease progression modeling; Recurrent neural networks; Linear discriminant analysis; Alzheimer's disease; Magnetic resonance imaging
PMID: 30682584
Subjects: Aged; Algorithms; Alzheimer Disease - diagnostic imaging; Alzheimer Disease - pathology; Alzheimer's disease; Biomarkers; Biomarkers - analysis; Brain; Cases (containers); Cortex (entorhinal); Cortex (temporal); Discriminant analysis; Disease Progression; Disease progression modeling; Female; Humans; Inspection; Learning algorithms; Linear discriminant analysis; Long short-term memory; Machine learning; Magnetic Resonance Imaging; Male; Mathematical models; Missing data; Modelling; Neural networks; Neural Networks, Computer; Neurodegenerative diseases; Neuroimaging; Neurological diseases; NMR; Nuclear magnetic resonance; Rank tests; Recurrent neural networks; Regression analysis; Temporal gyrus; Training; Trajectories