A Low-Bitwidth Integer-STBP Algorithm for Efficient Training and Inference of Spiking Neural Networks

Bibliographic Details
Published in Proceedings of the 28th Asia and South Pacific Design Automation Conference, pp. 651-656
Main Authors Tan, Pai-Yu, Wu, Cheng-Wen
Format Conference Proceeding
Language English
Published New York, NY, USA: ACM, 16.01.2023
Series ACM Conferences
Summary: Spiking neural networks (SNNs) that enable energy-efficient neuromorphic hardware are receiving growing attention. Training SNNs directly with back-propagation has demonstrated accuracy comparable to that of deep neural networks (DNNs). However, previous direct-training algorithms require high-precision floating-point operations, which are not suitable for low-power end-point devices; these operations also force the learning algorithm to run on high-performance accelerator hardware. In this paper, we propose an improved approach that converts the high-precision floating-point operations of an existing direct-training algorithm, the Spatio-Temporal Back-Propagation (STBP) algorithm, into low-bitwidth integer operations. The proposed low-bitwidth Integer-STBP algorithm requires only integer arithmetic for SNN training and inference, which greatly reduces computational complexity. Experimental results show that the proposed algorithm achieves accuracy comparable to, and energy efficiency higher than, the original floating-point STBP algorithm. Moreover, it can be implemented on low-power end-point devices, which mostly provide only fixed-point hardware, to add learning capability alongside inference.
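The abstract's central idea, replacing floating-point arithmetic with low-bitwidth integer arithmetic, can be illustrated with a generic uniform symmetric quantization sketch. This is an illustrative assumption, not the paper's actual quantization scheme; the function name, bitwidth, and scaling strategy here are hypothetical:

```python
import numpy as np

def quantize(x, bits=8):
    """Map a float array to signed integers of the given bitwidth.

    Returns the integer tensor and the float scale such that
    x is approximately q * scale. Subsequent arithmetic (e.g.
    accumulating synaptic inputs) can then run on q in integers.
    """
    qmax = 2 ** (bits - 1) - 1          # e.g. 127 for 8 bits
    max_abs = np.max(np.abs(x))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int32)
    return q, scale

# Example: quantize a small weight vector to 8-bit integers.
w = np.array([-0.6, 0.0, 0.3, 0.6])
q, s = quantize(w, bits=8)
# q holds int32 values in [-128, 127]; w ≈ q * s
```

The appeal for end-point devices is that once weights, membrane potentials, and gradients are kept in such integer representations, both the forward (inference) and backward (training) passes need only fixed-point hardware.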
ISBN: 9781450397834 (1450397832)
DOI: 10.1145/3566097.3567875