Minimizing Off-Chip Memory Access for CNN Accelerators

Bibliographic Details
Published in: IEEE Consumer Electronics Magazine, Vol. 11, no. 3, pp. 95-104
Main Authors: Tewari, Saurabh; Kumar, Anshul; Paul, Kolin
Format: Magazine Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2022
Summary: Convolutional neural network (CNN) accelerators are commonly used to boost the performance of CNN applications. The energy efficiency of these accelerators is of paramount importance for battery-operated devices such as smartphones, and a substantial fraction of their energy consumption is due to off-chip memory accesses. These accelerators connect to off-chip memory through a wide bus to improve throughput; however, accessing data at an unaligned address, or in a size that is not a multiple of the bus width, leads to low bus-width utilization and wasted energy. Memory accesses can be reduced considerably by partitioning the data in a way that increases the number of aligned accesses and optimally utilizes the bus width. We propose an approach that factors in the architectural parameters to evaluate the memory accesses. Our tool determines the optimal partitioning and data-reuse scheme for convolutional and fully connected layers to minimize off-chip memory accesses for these accelerators. Compared to the state of the art, our approach reduces the off-chip memory accesses of AlexNet, VGG16, and ResNet-50 by 9%, 16%, and 28% on a 64-bit data bus, and by 16%, 29%, and 46% on a 128-bit data bus, respectively.
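The alignment effect the summary describes can be illustrated with a minimal sketch (this is not the authors' tool; the helper names `bus_transactions` and `utilization` are hypothetical): a wide bus always transfers full, aligned bus-width chunks, so an access that starts at an unaligned address or spans a partial chunk costs extra transactions whose excess bytes are wasted.

```python
def bus_transactions(addr: int, size: int, bus_bytes: int) -> int:
    """Number of bus transactions to read `size` bytes starting at `addr`,
    assuming each transaction moves one bus-width-aligned chunk."""
    first_chunk = addr // bus_bytes
    last_chunk = (addr + size - 1) // bus_bytes
    return last_chunk - first_chunk + 1

def utilization(addr: int, size: int, bus_bytes: int) -> float:
    """Fraction of transferred bytes that were actually requested."""
    return size / (bus_transactions(addr, size, bus_bytes) * bus_bytes)

# 128-bit bus = 16 bytes per transaction.
# Aligned 64-byte read: 4 transactions, 100% utilization.
print(bus_transactions(0, 64, 16), utilization(0, 64, 16))   # 4 1.0
# Same read starting 8 bytes into a chunk: 5 transactions, 80% utilization.
print(bus_transactions(8, 64, 16), utilization(8, 64, 16))   # 5 0.8
```

Under this model, wider buses amplify the penalty of misalignment (the wasted fraction per transaction grows with bus width), which is consistent with the larger savings the paper reports on the 128-bit bus than on the 64-bit bus.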
ISSN: 2162-2248, 2162-2256
DOI: 10.1109/MCE.2021.3097697