Dynamic layer-span connecting spiking neural networks with backpropagation training

Bibliographic Details
Published in: Complex & Intelligent Systems, Vol. 10, No. 2, pp. 1937–1952
Main Authors: Wang, Zijjian; Huang, Yuxuan; Zhu, Yaqin; Xu, Binxing; Chen, Long
Format: Journal Article
Language: English
Published: Cham: Springer International Publishing, 01.04.2024

Summary: The Spiking Neural Network (SNN) is one of the mainstream frameworks for brain-like and neuromorphic computing, and it has the potential to overcome current AI challenges such as learning dynamic processes at low power. However, a large performance gap remains between SNNs and artificial neural networks (ANNs) in traditional supervised learning. One way to close this gap is to design better spiking neuron models with stronger memory for temporal data. This paper proposes a leaky integrate-and-fire (LIF) neuron model with dynamic postsynaptic potential and a layer-span connecting method for SNNs trained with backpropagation. The dynamic postsynaptic potential LIF model lets neurons in an SNN release neurotransmitters dynamically, mimicking the activity of biological neurons, while the layer-span connecting method enhances the long-distance memory ability of the SNN. We also introduce, for the first time, a cosh-based surrogate gradient for backpropagation training of SNNs. We compare the SNN with the cosh-based surrogate gradient (CSNN), the CSNN with dynamic postsynaptic potential (Dyn-CSNN), the layer-span connecting CSNN (Las-CSNN), and the SNN model combining all the proposed methods (DlaCSNN-BP) on three image classification datasets and one text classification dataset. The experimental results show that the proposed SNN methods outperform most previously proposed SNNs and ANNs with the same network structure, and among them DlaCSNN-BP achieves the best classification performance. These results indicate that the proposed methods effectively improve SNN performance in supervised learning, narrow the gap with deep learning, and bring SNNs closer to practical application.
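
The summary names three techniques, a cosh-based surrogate gradient, an LIF neuron with dynamic postsynaptic potential, and layer-span connections, without giving their equations. The sketch below is one plausible PyTorch reading, not the paper's exact formulation: the spike function uses a sech^2 (i.e., 1/cosh^2) surrogate in the backward pass, the dynamic postsynaptic potential is stood in for by a learnable per-neuron membrane decay, and the layer-span connection is a projection of layer-1 spikes past layer 2 into layer 3's input. The class names, the alpha sharpness constant, the sigmoid-parameterized decay, and the span projection are all illustrative assumptions.

import torch
import torch.nn as nn


class CoshSurrogateSpike(torch.autograd.Function):
    # Heaviside spike in the forward pass; a cosh-based surrogate gradient
    # (a sech^2 = 1/cosh^2 bump centered on the threshold) in the backward pass.
    alpha = 2.0  # hypothetical sharpness constant; the exact form is not in this record

    @staticmethod
    def forward(ctx, v_mem, threshold):
        ctx.save_for_backward(v_mem)
        ctx.threshold = threshold
        return (v_mem >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_mem,) = ctx.saved_tensors
        x = CoshSurrogateSpike.alpha * (v_mem - ctx.threshold)
        surrogate = CoshSurrogateSpike.alpha / (2.0 * torch.cosh(x) ** 2)
        return grad_output * surrogate, None  # no gradient for the threshold


class DynLIFLayer(nn.Module):
    # LIF layer whose membrane decay is learnable per neuron, a stand-in
    # for the paper's dynamic postsynaptic potential.
    def __init__(self, in_features, out_features, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay_logit = nn.Parameter(torch.zeros(out_features))  # decay kept in (0, 1)
        self.threshold = threshold

    def forward(self, x_seq):
        # x_seq: (T, batch, in_features) input sequence over T time steps
        T, batch, _ = x_seq.shape
        v = torch.zeros(batch, self.fc.out_features, device=x_seq.device)
        decay = torch.sigmoid(self.decay_logit)
        spikes = []
        for t in range(T):
            v = decay * v + self.fc(x_seq[t])               # leaky integration
            s = CoshSurrogateSpike.apply(v, self.threshold)
            v = v * (1.0 - s)                               # reset to zero on spike
            spikes.append(s)
        return torch.stack(spikes)                          # (T, batch, out_features)


class LayerSpanSNN(nn.Module):
    # Three LIF layers with one layer-span connection: layer-1 spikes are
    # projected past layer 2 directly into layer 3's input.
    def __init__(self, in_features, hidden, num_classes):
        super().__init__()
        self.l1 = DynLIFLayer(in_features, hidden)
        self.l2 = DynLIFLayer(hidden, hidden)
        self.span = nn.Linear(hidden, hidden)               # hypothetical span projection
        self.l3 = DynLIFLayer(hidden, num_classes)

    def forward(self, x_seq):
        s1 = self.l1(x_seq)
        s2 = self.l2(s1)
        s3 = self.l3(s2 + self.span(s1))                    # layer-span (skip) input
        return s3.mean(dim=0)                               # spike-rate readout per class

A cross-entropy loss on the rate readout then trains the whole stack end-to-end with ordinary backpropagation, the cosh surrogate supplying gradients through the otherwise non-differentiable spike.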
ISSN: 2199-4536; 2198-6053
DOI: 10.1007/s40747-023-01245-7