24.4 Sandwich-RAM: An Energy-Efficient In-Memory BWN Architecture with Pulse-Width Modulation
Published in: Digest of Technical Papers - IEEE International Solid-State Circuits Conference, pp. 394-396
Main Authors:
Format: Conference Proceeding
Language: English
Published: IEEE, 01.02.2019
Summary: Convolutional neural networks (CNNs) achieve state-of-the-art results in visual perception, drastically changing the traditional computer-vision framework. However, the movement of massive amounts of data prevents CNNs from being integrated into low-power IoT devices. The recently proposed binary-weight network (BWN) reduces both the complexity of computation and the amount of memory access. A conventional digital implementation, composed of separate feature/weight memories and a multiply-and-accumulate (MAC) unit, requires large amounts of data to be moved [3]. To reduce power, in-memory computing architectures integrate the weight memory and the computation together [1, 2, 5]. However, feature data is still stored externally, so data movement is only partially addressed, especially for BWNs. This paper layers feature memory, partial-weight memory, and the computing circuit together, like a sandwich, achieving significantly less data access (Fig. 24.4.1). It also uses a reconfigurable analog computation engine, based on pulse-width modulation, that is small and flexible enough to be inserted into the memory.
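The core idea behind both techniques can be sketched in software. In a BWN, weights are constrained to {-1, +1}, so each multiply in the MAC collapses to a sign-controlled add or subtract; a PWM engine can then perform the accumulation in the analog domain by encoding each feature magnitude as a pulse width and the weight as a polarity. The following is a minimal behavioral sketch under those assumptions; the function names, the t_lsb/i_unit constants, and the charge model are illustrative, not the paper's circuit.

```python
import numpy as np

def bwn_mac(features, weights):
    """Binary-weight MAC: with weights constrained to {-1, +1},
    every multiply collapses into a sign-controlled add/subtract,
    so no hardware multiplier is needed."""
    features = np.asarray(features)
    weights = np.asarray(weights)
    assert set(np.unique(weights)) <= {-1, 1}, "BWN weights must be +/-1"
    return int(np.sum(np.where(weights > 0, features, -features)))

def pwm_mac_model(features, weights, t_lsb=1e-9, i_unit=1e-6):
    """Illustrative behavioral model (not the paper's circuit) of a
    PWM-based analog accumulator: each feature magnitude sets the
    width of a current pulse, the binary weight sets its polarity,
    and the resulting charges sum on a shared node."""
    charge = 0.0
    for x, w in zip(features, weights):
        pulse_width = abs(x) * t_lsb              # pulse width encodes |x|
        polarity = w * (1 if x >= 0 else -1)      # weight flips the sign
        charge += polarity * i_unit * pulse_width
    # An ADC would quantize the node voltage; here we decode exactly.
    return round(charge / (i_unit * t_lsb))

# Both paths compute the same dot product on random BWN data.
rng = np.random.default_rng(0)
x = rng.integers(-8, 8, size=16)   # small signed feature values
w = rng.choice([-1, 1], size=16)   # binary weights
assert bwn_mac(x, w) == pwm_mac_model(x, w)
```

The final decode stands in for the ADC stage; in the actual design the pulse widths and currents are circuit quantities rather than the constants used here, which is what lets the analog engine stay small enough to be interleaved with the memory arrays.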
ISSN: 2376-8606
DOI: 10.1109/ISSCC.2019.8662435