Spike Counts Based Low Complexity SNN Architecture With Binary Synapse

Bibliographic Details
Published in: IEEE Transactions on Biomedical Circuits and Systems, Vol. 13, No. 6, pp. 1664-1677
Main Authors: Tang, Hoyoung; Kim, Heetak; Kim, Hyeonseong; Park, Jongsun
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2019
ISSN: 1932-4545 (print), 1940-9990 (electronic)
DOI: 10.1109/TBCAS.2019.2945406

Summary: In this paper, we present an energy- and area-efficient spiking neural network (SNN) processor based on novel spike-count-based methods. For a low-cost SNN design, we propose hardware-friendly complexity-reduction techniques for both the learning and inference modes of operation. First, for the unsupervised learning process, we propose a spike-count-based learning method. This novel approach uses pre- and post-synaptic spike counts to reduce both the bit-width of the synaptic weights and the number of weight updates. For energy-efficient inference, we propose an accumulation-based computing scheme in which the input spikes for each input axon are accumulated, without immediate membrane updates, until a predefined spike count is reached. In addition, computation-skip schemes identify meaningless computations and skip them to improve energy efficiency. Based on the proposed low-complexity design techniques, we design and implement the SNN processor in a 65 nm CMOS process. According to the implementation results, the processor achieves 87.4% recognition accuracy on the MNIST dataset using only 230 k 1-bit synaptic weights with 400 excitatory neurons. The energy consumption is 0.26 pJ/SOP and 0.31 μJ/inference in inference mode, and 1.42 pJ/SOP and 2.63 μJ/learning in learning mode.
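The accumulation-based computing scheme summarized above can be sketched roughly as follows. This is an illustrative reconstruction from the abstract alone: the function and parameter names (`accumulate_and_update`, `spike_limit`), the array shapes, and the counter-reset policy are assumptions for demonstration, not the paper's actual hardware behavior.

```python
# Illustrative sketch (assumptions, not the paper's design): per-axon spike
# counters defer membrane updates until a predefined spike count is reached,
# and axons whose weight rows are all zero are skipped entirely.
import numpy as np

def accumulate_and_update(spikes, weights, membrane, counters, spike_limit=4):
    """Accumulate input spikes per axon; only when an axon's counter reaches
    spike_limit is its batched contribution applied to the membrane potentials."""
    counters += spikes                       # count spikes per input axon
    ready = counters >= spike_limit          # axons that hit the predefined count
    if ready.any():
        # computation skip: axons with an all-zero weight row contribute nothing
        active = ready & weights.any(axis=1)
        membrane += spike_limit * weights[active].sum(axis=0)
        counters[ready] = 0                  # reset the accumulated axons
    return membrane, counters

# toy example: 6 input axons, 3 neurons, 1-bit (binary) synaptic weights
rng = np.random.default_rng(0)
weights = rng.integers(0, 2, size=(6, 3)).astype(float)
membrane = np.zeros(3)
counters = np.zeros(6, dtype=np.int64)
for _ in range(8):
    spikes = rng.integers(0, 2, size=6)
    membrane, counters = accumulate_and_update(spikes, weights, membrane, counters)
```

Batching `spike_limit` spikes into a single membrane update is what trades per-spike arithmetic for occasional bulk updates; in hardware this would correspond to reading each weight row far less often.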