CASA: A Convolution Accelerator using Skip Algorithm for Deep Neural Network


Bibliographic Details
Published in: 2019 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 1 - 5
Main Authors: Kim, Young Ho; An, Gi Jo; Sunwoo, Myung Hoon
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2019

Summary: This paper proposes an accelerator that performs efficient convolution operations in the inference phase of a convolutional neural network (CNN). The convolution operation comprises multiply-and-accumulate (MAC) operations on weight and feature map data. If a neuron's inputs are zero, that neuron has little effect on the performance of the neural network. Therefore, if convolution operations can be skipped by checking whether the inputs are zero, the computational complexity can be greatly reduced. The proposed algorithm skips a convolution operation when the number of zeros among the input feature map values is greater than or equal to a threshold of 7 for a 3 × 3 filter. The threshold was chosen by weighing the number of skipped convolution operations against the resulting accuracy. The proposed convolution accelerator was synthesized at an operating frequency of 400 MHz in 65-nm technology. It achieved 207 Giga operations per second (GOp/s), 180% higher than without skipping, and an energy efficiency of 473 GOp/s/W with a core area of 1.2 mega gate equivalents (MGE).
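The skip decision described in the summary can be illustrated with a minimal software sketch. This is not the authors' hardware design; it is a hypothetical NumPy model of the thresholding rule (function and parameter names are assumptions), showing how a 3 × 3 MAC is bypassed when the input patch contains at least 7 zeros:

```python
import numpy as np

def skip_convolution(patch, kernel, zero_threshold=7):
    """Model of the zero-skip rule for one 3x3 convolution window.

    If the input patch contains at least `zero_threshold` zeros,
    the multiply-accumulate is skipped entirely and 0 is returned
    (the window's contribution is treated as negligible); otherwise
    the full MAC over the 3x3 window is computed.
    """
    if np.count_nonzero(patch == 0) >= zero_threshold:
        return 0.0  # skipped: almost-all-zero input contributes little
    return float(np.sum(patch * kernel))  # ordinary 3x3 MAC
```

In hardware, the zero count would be produced by simple comparators on the input feature map values before the MAC array is engaged, so a skipped window saves the multiplier activity entirely.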
ISBN:9781728103976
1728103975
ISSN:2158-1525
DOI:10.1109/ISCAS.2019.8702307