Continuous Online Sequence Learning with an Unsupervised Neural Network Model
Published in | Neural computation, Vol. 28, no. 11, pp. 2474-2504 |
---|---|
Main Authors | Cui, Yuwei; Ahmad, Subutai; Hawkins, Jeff |
Format | Journal Article |
Language | English |
Published | MIT Press (One Rogers Street, Cambridge, MA 02142-1209, USA), 01.11.2016 |
Subjects | Algorithms; Learning; Memory; Neural networks; Neurons; Recognition; Statistical methods |
Online Access | Get full text |
Abstract | The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory has recently been proposed as a theoretical framework for sequence learning in the cortex. In this letter, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show the model is able to continuously learn a large number of variable-order temporal sequences using an unsupervised Hebbian-like learning rule. The sparse temporal codes formed by the model can robustly handle branching temporal sequences by maintaining multiple predictions until there is sufficient disambiguating evidence. We compare the HTM sequence memory with other sequence learning algorithms, including statistical methods (autoregressive integrated moving average), feedforward neural networks (time delay neural network and online sequential extreme learning machine), and recurrent neural networks (long short-term memory and echo state networks), on sequence prediction problems with both artificial and real-world data. The HTM model achieves comparable accuracy to other state-of-the-art algorithms. The model also exhibits properties that are critical for sequence learning, including continuous online learning, the ability to handle multiple predictions and branching sequences with high-order statistics, robustness to sensor noise and fault tolerance, and good performance without task-specific hyperparameter tuning. Therefore, the HTM sequence memory not only advances our understanding of how the brain may solve the sequence learning problem but is also applicable to real-world sequence learning problems from continuous data streams. |
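The abstract's key claims of continuous online learning and maintaining multiple predictions for branching sequences can be illustrated with a toy first-order learner. This is a hypothetical sketch for intuition only, not the paper's HTM model (which uses sparse distributed representations and active dendritic segments); all names here are invented for the example.

```python
from collections import defaultdict

class OnlineSequencePredictor:
    """Toy first-order online sequence learner (illustrative only).

    Transition strengths are reinforced Hebbian-style each time a pair of
    consecutive symbols is observed. Prediction returns *all* symbols whose
    strength exceeds a threshold, so an ambiguous context (a branch point)
    keeps multiple predictions active until evidence disambiguates.
    """

    def __init__(self, increment=0.1, threshold=0.15):
        self.weights = defaultdict(float)  # (prev, next) -> strength
        self.increment = increment
        self.threshold = threshold
        self.prev = None                   # current context symbol

    def observe(self, symbol):
        """Learn from one streaming input, then advance the context."""
        if self.prev is not None:
            self.weights[(self.prev, symbol)] += self.increment
        self.prev = symbol

    def predict(self):
        """Return the set of symbols predicted to follow the last input."""
        return {nxt for (prv, nxt), w in self.weights.items()
                if prv == self.prev and w >= self.threshold}

# Stream two branching sequences: after 'B', both 'C' and 'D' remain
# predicted, since this first-order context cannot disambiguate them.
model = OnlineSequencePredictor()
for seq in ["ABC", "ABD"] * 3:
    model.prev = None          # reset context between sequences
    for s in seq:
        model.observe(s)

model.prev = "B"
print(sorted(model.predict()))
```

A first-order learner like this keeps both branches alive forever; the paper's point is that HTM's high-order sparse temporal codes can represent longer contexts and so collapse such ambiguity once disambiguating inputs arrive.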
Author | Cui, Yuwei (Numenta, Inc., Redwood City, CA 94063, U.S.A.; ycui@numenta.com); Ahmad, Subutai (Numenta, Inc., Redwood City, CA 94063, U.S.A.; sahmad@numenta.com); Hawkins, Jeff (Numenta, Inc., Redwood City, CA 94063, U.S.A.; jhawkins@numenta.com) |
CODEN | NEUCEB |
ContentType | Journal Article |
Copyright | Copyright MIT Press Journals Nov 2016 |
DOI | 10.1162/NECO_a_00893 |
Discipline | Computer Science |
EISSN | 1530-888X |
EndPage | 2504 |
Genre | Journal Article Feature |
ISSN | 0899-7667 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 11 |
Language | English |
OpenAccessLink | https://direct.mit.edu/neco/article/doi/10.1162/NECO_a_00893 |
PMID | 27626963 |
PQID | 1837536381 |
PQPubID | 37252 |
PageCount | 31 |
PublicationDate | 2016-11-01 |
PublicationTitle | Neural computation |
PublicationTitleAlternate | Neural Comput |
PublicationYear | 2016 |
Publisher | MIT Press; MIT Press Journals, The |
StartPage | 2474 |
SubjectTerms | Algorithms Distance learning Handles Learning Letters Mathematical models Memory Neural networks Neurons Online instruction Recognition Statistical methods Survival |
Title | Continuous Online Sequence Learning with an Unsupervised Neural Network Model |
URI | https://direct.mit.edu/neco/article/doi/10.1162/NECO_a_00893 https://www.ncbi.nlm.nih.gov/pubmed/27626963 https://www.proquest.com/docview/1837536381 https://search.proquest.com/docview/1850783318 https://search.proquest.com/docview/1859717181 https://search.proquest.com/docview/1864538603 |
Volume | 28 |