A Mixed-Signal Binarized Convolutional-Neural-Network Accelerator Integrating Dense Weight Storage and Multiplication for Reduced Data Movement

Bibliographic Details
Published in: 2018 IEEE Symposium on VLSI Circuits, pp. 141–142
Main Authors: Valavi, Hossein; Ramadge, Peter J.; Nestler, Eric; Verma, Naveen
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2018
DOI: 10.1109/VLSIC.2018.8502421

Summary: We present a 65nm CMOS mixed-signal accelerator for the first and hidden layers of binarized CNNs. Hidden layers support up to 512 3×3×512 binary-input filters, and first layers support up to 64 3×3×3 analog-input filters. Weight storage and multiplication with input activations are achieved within compact hardware only 1.8× larger than a 6T SRAM bit cell, and output activations are computed via capacitive charge sharing, requiring distribution of only a switch-control signal. The reduced data movement yields an energy efficiency of 658 (binary) / 0.95 TOPS/W and a throughput of 9438 (binary) / 10.64 GOPS for the hidden / first layers, respectively.