Triple-Memory Networks: A Brain-Inspired Method for Continual Learning
Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 33, No. 5, pp. 1925-1934 |
Main Authors | Liyuan Wang, Bo Lei, Qian Li, Hang Su, Jun Zhu, Yi Zhong |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.05.2022 |
Subjects | Continual learning; Catastrophic forgetting; Brain-inspired algorithm; Generative adversarial networks; Deep learning |
ISSN | 2162-237X (print); 2162-2388 (electronic)
DOI | 10.1109/TNNLS.2021.3111019 |
Abstract | Continual acquisition of novel experience without interfering with previously learned knowledge, i.e., continual learning, is critical for artificial neural networks, while limited by catastrophic forgetting. A neural network adjusts its parameters when learning a new task but then fails to conduct the old tasks well. By contrast, the biological brain can effectively address catastrophic forgetting through consolidating memories as more specific or more generalized forms to complement each other, which is achieved in the interplay of the hippocampus and neocortex, mediated by the prefrontal cortex. Inspired by such a brain strategy, we propose a novel approach named triple-memory networks (TMNs) for continual learning. TMNs model the interplay of the three brain regions as a triple-network architecture of generative adversarial networks (GANs). The input information is encoded as specific representations of data distributions in a generator, or generalized knowledge of solving tasks in a discriminator and a classifier, with implementing appropriate brain-inspired algorithms to alleviate catastrophic forgetting in each module. Particularly, the generator replays generated data of the learned tasks to the discriminator and the classifier, both of which are implemented with a weight consolidation regularizer to complement the lost information in the generation process. TMNs achieve the state-of-the-art performance of generative memory replay on a variety of class-incremental learning benchmarks on MNIST, SVHN, CIFAR-10, and ImageNet-50. |
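The abstract describes two complementary mechanisms: the generator replays pseudo-data of earlier tasks to the discriminator and classifier, and both of those modules additionally carry a weight-consolidation regularizer that penalizes drift of parameters important to old tasks. The snippet below is a minimal, hypothetical PyTorch sketch of one classifier update combining these two mechanisms; it is not the authors' implementation, and the conditional-generator call `generator(z, y)`, the `latent_dim` argument, and the per-parameter `importance` estimates (for example, an EWC-style Fisher diagonal) are assumptions made for illustration.

```python
# Minimal sketch only: generative replay plus weight consolidation for the
# classifier module. The conditional-generator interface and the importance
# dictionary are assumed for illustration, not taken from the paper's code.
import torch
import torch.nn.functional as F


def consolidation_penalty(model, anchor_params, importance):
    """Quadratic penalty keeping parameters near their values after the previous
    task, weighted by a per-parameter importance estimate (EWC-style)."""
    penalty = 0.0
    for name, p in model.named_parameters():
        penalty = penalty + (importance[name] * (p - anchor_params[name]) ** 2).sum()
    return penalty


def classifier_step(classifier, optimizer, real_x, real_y, generator, latent_dim,
                    old_classes, anchor_params, importance, lam=1.0, replay_ratio=0.5):
    """One update on the current task, mixed with generated replay of earlier
    classes and regularized toward the consolidated weights."""
    n_replay = max(1, int(replay_ratio * real_x.size(0)))
    with torch.no_grad():
        z = torch.randn(n_replay, latent_dim)
        # old_classes: 1-D LongTensor holding the labels of previously learned classes
        y_replay = old_classes[torch.randint(len(old_classes), (n_replay,))]
        x_replay = generator(z, y_replay)      # pseudo-samples of old classes

    x = torch.cat([real_x, x_replay], dim=0)
    y = torch.cat([real_y, y_replay], dim=0)

    loss = F.cross_entropy(classifier(x), y) \
        + lam * consolidation_penalty(classifier, anchor_params, importance)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Mixing replayed and real samples in each batch rehearses earlier classes, while the quadratic penalty compensates for details the generator cannot reproduce faithfully, matching the complementary roles the abstract assigns to the three modules.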
Author | Li, Qian; Zhu, Jun; Su, Hang; Lei, Bo; Wang, Liyuan; Zhong, Yi |
Author_xml | 1. Liyuan Wang (ORCID 0000-0002-3869-8155), School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China; 2. Bo Lei, School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China; 3. Qian Li (ORCID 0000-0001-7317-1570), School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China; 4. Hang Su, Department of Computer Science and Technology, THBI Laboratory, BNRist Center, Institute for AI, Tsinghua University, Beijing, China; 5. Jun Zhu (ORCID 0000-0002-6254-2388; dcszj@tsinghua.edu.cn), Department of Computer Science and Technology, THBI Laboratory, BNRist Center, Institute for AI, Tsinghua University, Beijing, China; 6. Yi Zhong (zhongyithu@tsinghua.edu.cn), School of Life Sciences, IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China |
CODEN | ITNNAL |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
DOI | 10.1109/TNNLS.2021.3111019 |
Discipline | Computer Science |
EISSN | 2162-2388 |
EndPage | 1934 |
Genre | orig-research Journal Article |
GrantInformation_xml | NSF of China (Grants 62061136001, 61620106010, U19B2034, U181146, 62076145; funder ID 10.13039/501100001809); Beijing Academy of Artificial Intelligence (BAAI); NVIDIA NVAIL Program with GPU/DGX Acceleration (funder ID 10.13039/100007065); Tsinghua-Peking Joint Center for Life Sciences (funder ID 10.13039/501100011620); Tsinghua-Huawei Joint Research Program; Beijing NSF (Grant JQ19016); Tsinghua Institute for Guo Qiang |
ISSN | 2162-237X 2162-2388 |
IsPeerReviewed | false |
IsScholarly | true |
Issue | 5 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0002-6254-2388 0000-0001-7317-1570 0000-0002-3869-8155 |
PMID | 34529579 |
PageCount | 10 |
PublicationDate | 2022-05-01 |
PublicationPlace | United States |
PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev | TNNLS |
PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
PublicationYear | 2022 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 1925 |
SubjectTerms | Algorithms; Artificial neural networks; Benchmarks; Biological effects; Biological neural networks; Brain; Brain modeling; Brain-inspired algorithm; catastrophic forgetting; Cerebral cortex; Classifiers; Computer architecture; continual learning; deep learning; Generative adversarial networks; Hippocampus; Information processing; Knowledge; Learning; Life sciences; Neocortex; Neural networks; Neural Networks, Computer; Prefrontal cortex; Synapses; Task analysis; Training data
Title | Triple-Memory Networks: A Brain-Inspired Method for Continual Learning |
URI | https://ieeexplore.ieee.org/document/9540230 https://www.ncbi.nlm.nih.gov/pubmed/34529579 https://www.proquest.com/docview/2659343164 https://www.proquest.com/docview/2574404710 |
Volume | 33 |