On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices

Bibliographic Details
Published in: Frontiers in Neuroscience, Vol. 14, p. 423
Main Authors: Kwon, Dongseok; Lim, Suhwan; Bae, Jong-Ho; Lee, Sung-Tae; Kim, Hyeongsu; Seo, Young-Tak; Oh, Seongbin; Kim, Jangsaeng; Yeom, Kyuho; Park, Byung-Gook; Lee, Jong-Ho
Format: Journal Article
Language: English
Published: Lausanne: Frontiers Research Foundation (Frontiers Media S.A.), 07.07.2020
Summary: Hardware-based spiking neural networks (SNNs), inspired by the biological nervous system, are regarded as an innovative computing platform offering very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme that approximates the backpropagation algorithm in a form suitable for hardware implementation. By exploiting the stochastic characteristics of neurons, we show that the accuracy of the proposed scheme for SNNs is close to that of conventional artificial neural networks. In the hardware configuration, gated Schottky diodes (GSDs), whose output current saturates with respect to the input voltage, serve as synaptic devices. We design the SNN system around the proposed on-chip training scheme, using GSDs that can update their conductance in parallel to speed up the overall system. The performance of the on-chip training SNN system is validated through MNIST data set classification as a function of network size and total number of time steps. The SNN systems achieve an accuracy of 97.83% with one hidden layer and 98.44% with four hidden layers in fully connected neural networks. We then evaluate the effect of the nonlinearity and asymmetry of the conductance response for long-term potentiation (LTP) and long-term depression (LTD) on the performance of the on-chip training SNN system. In addition, the impact of device variations on this performance is evaluated.
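The abstract does not reproduce the paper's equations, but the nonlinearity and asymmetry of the LTP/LTD conductance response it evaluates are commonly modeled with exponential pulse-response curves. The sketch below is a minimal illustration under that assumption; the function names, the nonlinearity parameter a, the pulse counts, and the 5% variation figure are illustrative choices, not the authors' model or code.

```python
import numpy as np

def ltp(p, p_max, g_min, g_max, a):
    """Conductance after p potentiation pulses (assumed exponential model).

    a is a nonlinearity parameter: the response approaches linear as a grows.
    """
    b = (g_max - g_min) / (1.0 - np.exp(-p_max / a))
    return g_min + b * (1.0 - np.exp(-p / a))

def ltd(p, p_max, g_min, g_max, a):
    """Conductance after p depression pulses, starting from g_max."""
    b = (g_max - g_min) / (1.0 - np.exp(-p_max / a))
    return g_max - b * (1.0 - np.exp(-p / a))

pulses = np.arange(65)
# Asymmetry: different nonlinearity parameters for potentiation vs. depression.
g_up = ltp(pulses, p_max=64, g_min=0.1, g_max=1.0, a=32.0)    # near-linear LTP
g_down = ltd(pulses, p_max=64, g_min=0.1, g_max=1.0, a=8.0)   # more nonlinear LTD

# Device-to-device variation: perturb each synapse's range (illustrative 5%).
rng = np.random.default_rng(0)
g_max_per_device = 1.0 + 0.05 * rng.standard_normal(784 * 100)
```

In a simulation of on-chip training, each weight update from the approximated backpropagation step would be quantized into programming pulses and the realized conductance change read off curves like these; this is the usual way the sensitivity to nonlinearity, asymmetry, and device variation described above is swept.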
This article was submitted to Neuromorphic Engineering, a section of the journal Frontiers in Neuroscience
Edited by: Elisa Donati, ETH Zürich, Switzerland
Reviewed by: Michael Pfeiffer, Bosch Center for Artificial Intelligence, Germany; Shuangming Yang, Tianjin University, China
ISSN: 1662-4548 (print); 1662-453X (electronic)
DOI: 10.3389/fnins.2020.00423