Fast and energy-efficient neuromorphic deep learning with first-spike times

Bibliographic Details
Published in: Nature Machine Intelligence, Vol. 3, No. 9, pp. 823–835
Main Authors: Göltz, J., Kriener, L., Baumbach, A., Billaudelle, S., Breitwieser, O., Cramer, B., Dold, D., Kungl, A. F., Senn, W., Schemmel, J., Meier, K., Petrovici, M. A.
Format: Journal Article
Language: English
Published: Basingstoke: Nature Publishing Group, 01.09.2021
Summary: For a biological agent operating under environmental pressure, energy consumption and reaction times are of critical importance. Similarly, engineered systems are optimized for short time-to-solution and low energy-to-solution characteristics. At the level of neuronal implementation, this implies achieving the desired results with as few and as early spikes as possible. With time-to-first-spike coding, both of these goals are inherently emerging features of learning. Here, we describe a rigorous derivation of a learning rule for such first-spike times in networks of leaky integrate-and-fire neurons, relying solely on input and output spike times, and show how this mechanism can implement error backpropagation in hierarchical spiking networks. Furthermore, we emulate our framework on the BrainScaleS-2 neuromorphic system and demonstrate its capability of harnessing the system's speed and energy characteristics. Finally, we examine how our approach generalizes to other neuromorphic platforms by studying how its performance is affected by typical distortive effects induced by neuromorphic substrates.

Spiking neural networks promise fast and energy-efficient information processing. The 'time-to-first-spike' coding scheme, where the time elapsed before a neuron's first spike is utilized as the main variable, is a particularly efficient approach, and Göltz and Kriener et al. demonstrate that error backpropagation, an essential ingredient for learning in neural networks, can be implemented in this scheme.
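To make the time-to-first-spike idea concrete, the following is a minimal, hypothetical sketch, not the paper's BrainScaleS-2 implementation or its learning rule: input intensities are encoded as spike latencies (stronger input spikes earlier), a simplified non-leaky integrate-and-fire neuron integrates step currents from its inputs, and a classification is read out as the index of the earliest output spike. All function names, parameters, and the weight matrix below are illustrative.

```python
import math

def ttfs_encode(intensity, t_max=10.0, eps=1e-9):
    """Latency code: map an intensity in (0, 1] to a first-spike time.
    Stronger inputs spike earlier."""
    return t_max * (1.0 - intensity) + eps

def first_spike_time(weights, input_times, threshold=1.0):
    """First threshold crossing of a non-leaky integrate-and-fire neuron.
    Each input spike at time t_i switches on a constant current w_i, so the
    membrane potential is piecewise linear between input events; we walk the
    events in order and solve for the crossing analytically."""
    events = sorted(zip(input_times, weights))
    v, slope, t_prev = 0.0, 0.0, 0.0
    for t_i, w_i in events + [(math.inf, 0.0)]:
        if slope > 0.0:
            t_cross = t_prev + (threshold - v) / slope
            if t_cross <= t_i:
                return t_cross          # crossed before the next event
        v += slope * (t_i - t_prev)     # advance potential to next event
        slope += w_i                    # input spike switches on current w_i
        t_prev = t_i
    return math.inf                     # threshold never reached

if __name__ == "__main__":
    # Toy readout: two output neurons, label = index of earliest spike.
    in_times = [ttfs_encode(x) for x in (1.0, 0.2)]
    W = [[1.0, 0.1],                    # neuron 0 prefers input 0
         [0.1, 1.0]]                    # neuron 1 prefers input 1
    out_times = [first_spike_time(w, in_times) for w in W]
    label = min(range(len(out_times)), key=out_times.__getitem__)
    print(label)
```

The event-driven crossing computation mirrors why this scheme is efficient: nothing is simulated on a time grid, and the answer is available as soon as the first output neuron fires.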
ISSN: 2522-5839
DOI: 10.1038/s42256-021-00388-x