A Mixed-Signal Binarized Convolutional-Neural-Network Accelerator Integrating Dense Weight Storage and Multiplication for Reduced Data Movement
| Published in | 2018 IEEE Symposium on VLSI Circuits, pp. 141–142 |
|---|---|
| Main Authors | , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.06.2018 |
| DOI | 10.1109/VLSIC.2018.8502421 |
Summary: We present a 65nm CMOS mixed-signal accelerator for the first and hidden layers of binarized CNNs. Hidden layers support up to 512 3×3×512 binary-input filters, and first layers support up to 64 3×3×3 analog-input filters. Weight storage and multiplication with input activations are achieved within compact hardware only 1.8× larger than a 6T SRAM bit cell, and output activations are computed via capacitive charge sharing, requiring distribution of only a switch-control signal. Reduced data movement yields an energy efficiency of 658 (binary) / 0.95 TOPS/W and a throughput of 9438 (binary) / 10.64 GOPS for the hidden / first layers.
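In a binarized hidden layer of the kind the abstract describes, each multiply between a ±1 input activation and a ±1 weight reduces to an XNOR of their sign bits, and the accumulation (realized on-chip by capacitive charge sharing across the bit cells) reduces to a popcount. A minimal software sketch of that equivalence, not the chip's circuit; the function name and encoding (bit 1 → +1, bit 0 → −1) are illustrative, not from the paper:

```python
import numpy as np

def binary_dot_xnor(a_bits, w_bits):
    """Dot product of two ±1 vectors encoded as bits (1 -> +1, 0 -> -1),
    computed as XNOR + popcount: dot = 2*popcount(XNOR) - n."""
    n = len(a_bits)
    agree = ~(a_bits ^ w_bits) & 1  # 1 where the sign bits match (XNOR)
    return 2 * int(agree.sum()) - n

# Check against the arithmetic ±1 dot product for one 3x3x512 filter window.
rng = np.random.default_rng(0)
a_bits = rng.integers(0, 2, size=3 * 3 * 512)
w_bits = rng.integers(0, 2, size=3 * 3 * 512)
a_pm1 = 2 * a_bits - 1  # map {0,1} -> {-1,+1}
w_pm1 = 2 * w_bits - 1
assert binary_dot_xnor(a_bits, w_bits) == int(a_pm1 @ w_pm1)
```

The identity `dot = 2*popcount(XNOR) - n` holds because each agreeing bit pair contributes +1 and each disagreeing pair −1; the accelerator obtains the same sum in the analog domain as a shared-capacitor voltage rather than a digital popcount.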