Continuous Online Sequence Learning with an Unsupervised Neural Network Model
Published in: Neural Computation, Vol. 28, No. 11, pp. 2474–2504
Format: Journal Article
Language: English
Published: MIT Press, One Rogers Street, Cambridge, MA 02142-1209, USA, 01.11.2016
Summary: The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory has recently been proposed as a theoretical framework for sequence learning in the cortex. In this letter, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show that the model is able to continuously learn a large number of variable-order temporal sequences using an unsupervised Hebbian-like learning rule. The sparse temporal codes formed by the model can robustly handle branching temporal sequences by maintaining multiple predictions until there is sufficient disambiguating evidence. We compare the HTM sequence memory with other sequence learning algorithms on sequence prediction problems with both artificial and real-world data, including statistical methods (autoregressive integrated moving average), feedforward neural networks (time delay neural network and online sequential extreme learning machine), and recurrent neural networks (long short-term memory and echo state networks). The HTM model achieves accuracy comparable to these state-of-the-art algorithms. The model also exhibits properties that are critical for sequence learning: continuous online learning, the ability to handle multiple predictions and branching sequences with high-order statistics, robustness to sensor noise and fault tolerance, and good performance without task-specific hyperparameter tuning. Therefore, the HTM sequence memory not only advances our understanding of how the brain may solve the sequence learning problem but is also applicable to real-world sequence learning problems from continuous data streams.
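The abstract's key idea of handling branching sequences by maintaining multiple predictions until disambiguating evidence arrives can be illustrated with a toy variable-order predictor. This sketch is not the paper's HTM algorithm (which uses sparse distributed representations of cortical neurons); it only mimics the behavior with Hebbian-like transition counts, and all names (`SequencePredictor`, `observe`, `predict`, `order`) are hypothetical.

```python
from collections import defaultdict

class SequencePredictor:
    """Toy online sequence learner (illustration only, not HTM).

    Hebbian-like counts associate each recent context with the elements
    that followed it. Ambiguous contexts (shared by several sequences)
    yield multiple simultaneous predictions; longer contexts disambiguate.
    """

    def __init__(self, order=3):
        self.order = order                                   # max context length
        self.counts = defaultdict(lambda: defaultdict(int))  # context -> {next: count}
        self.context = []

    def observe(self, element):
        # Strengthen the association from every suffix of the current
        # context to the incoming element (Hebbian-like reinforcement).
        for k in range(1, len(self.context) + 1):
            self.counts[tuple(self.context[-k:])][element] += 1
        self.context = (self.context + [element])[-self.order:]

    def predict(self):
        # Prefer the longest matching context; return every element ever
        # seen after it, i.e., keep multiple predictions under ambiguity.
        for k in range(len(self.context), 0, -1):
            key = tuple(self.context[-k:])
            if key in self.counts:
                return set(self.counts[key])
        return set()
```

After training on the overlapping sequences "ABCD" and "XBCY", the ambiguous context "BC" predicts both D and Y, while the longer context "ABC" predicts only D, loosely mirroring the disambiguation behavior the abstract describes.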
Bibliography: November 2016
ISSN: 0899-7667; 1530-888X
DOI: 10.1162/NECO_a_00893