Accurate online training of dynamical spiking neural networks through Forward Propagation Through Time
| Published in | Nature Machine Intelligence, Vol. 5, No. 5, pp. 518–527 |
|---|---|
| Main Authors | Yin, Bojian; Corradi, Federico; Bohté, Sander M. |
| Format | Journal Article |
| Language | English |
| Published | London: Nature Publishing Group UK, 01.05.2023 |
Summary: With recent advances in learning algorithms, recurrent networks of spiking neurons are achieving performance that is competitive with vanilla recurrent neural networks. However, these algorithms are limited to small networks of simple spiking neurons and modest-length temporal sequences, as they impose high memory requirements, have difficulty training complex neuron models and are incompatible with online learning. Here, we show how the recently developed Forward-Propagation Through Time (FPTT) learning combined with novel liquid time-constant spiking neurons resolves these limitations. Applying FPTT to networks of such complex spiking neurons, we demonstrate online learning of exceedingly long sequences while outperforming current online methods and approaching or outperforming offline methods on temporal classification tasks. The efficiency and robustness of FPTT enable us to directly train a deep and performant spiking neural network for joint object localization and recognition, demonstrating the ability to train large-scale dynamic and complex spiking neural network architectures.
Memory-efficient online training of recurrent spiking neural networks without compromising accuracy is an open challenge in neuromorphic computing. Yin and colleagues demonstrate that training a recurrent neural network consisting of so-called liquid time-constant spiking neurons using an algorithm called Forward-Propagation Through Time allows for online learning and state-of-the-art performance at a reduced computational cost compared with existing approaches.
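To give a concrete picture of the two ingredients named in the summary, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a sigmoid-parameterized dynamic (liquid) time constant, a boxcar surrogate gradient, and a simplified FPTT-style update that pulls parameters toward a running average at every time step (the published algorithm additionally includes a gradient-correction term in its regularizer). All names (`LTCSpikingLayer`, `fptt_online_step`) and hyperparameters (`alpha`, the time-constant range, the surrogate window) are illustrative assumptions.

```python
# Minimal sketch (PyTorch), assuming a sigmoid-parameterized dynamic time constant
# and a generic FPTT-style regularizer toward a running parameter average.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a boxcar surrogate gradient (a common SNN choice)."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the threshold; the 0.5 window is an assumption.
        return grad_out * (v.abs() < 0.5).float()


class LTCSpikingLayer(nn.Module):
    """One recurrent layer of liquid time-constant spiking neurons (illustrative).

    The membrane time constant tau is not fixed: it is recomputed at every step
    from the current input and spike state, which is the liquid-time-constant
    idea in spirit.
    """

    def __init__(self, n_in, n_hid, dt=1.0):
        super().__init__()
        self.inp = nn.Linear(n_in, n_hid)
        self.rec = nn.Linear(n_hid, n_hid, bias=False)
        self.tau_gate = nn.Linear(n_in + n_hid, n_hid)  # per-step time constant
        self.dt = dt

    def forward(self, x_t, u, s):
        # Dynamic time constant in (dt, dt + 10); the range is an assumption.
        tau = self.dt + 10.0 * torch.sigmoid(self.tau_gate(torch.cat([x_t, s], dim=-1)))
        current = self.inp(x_t) + self.rec(s)
        u = u + (self.dt / tau) * (current - u)      # leaky membrane integration
        s = SpikeFn.apply(u - 1.0)                   # spike when u crosses threshold 1
        u = u * (1.0 - s)                            # hard reset after a spike
        return u, s


def fptt_online_step(model, opt, loss_t, w_bar, alpha=0.1):
    """One FPTT-style online update: instantaneous loss plus a quadratic pull
    toward a running average of the parameters, then refresh that average.
    (Simplified variant of the published algorithm.)
    """
    reg = sum(((p - wb) ** 2).sum() for p, wb in zip(model.parameters(), w_bar))
    opt.zero_grad()
    (loss_t + 0.5 * alpha * reg).backward()
    opt.step()
    with torch.no_grad():
        for p, wb in zip(model.parameters(), w_bar):
            wb.mul_(0.5).add_(0.5 * p)               # running parameter average
    return w_bar


# Toy usage: online training on one long sequence, updating at every time step.
torch.manual_seed(0)
net = nn.ModuleDict({"ltc": LTCSpikingLayer(4, 32), "readout": nn.Linear(32, 2)})
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
w_bar = [p.detach().clone() for p in net.parameters()]

x = torch.randn(200, 1, 4)                           # (time, batch, features)
y = torch.randint(0, 2, (1,))
u, s = torch.zeros(1, 32), torch.zeros(1, 32)
for t in range(x.shape[0]):
    u, s = net["ltc"](x[t], u.detach(), s.detach())  # detach state: per-step graph only
    loss_t = F.cross_entropy(net["readout"](s), y)
    w_bar = fptt_online_step(net, opt, loss_t, w_bar)
```

Because the hidden state is detached at every step, the computational graph never grows with sequence length, so memory stays constant per update; the quadratic pull toward the running parameter average is what keeps these purely local, per-step updates stable over long sequences.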
ISSN: 2522-5839
DOI: 10.1038/s42256-023-00650-4