Power- and Endurance-Aware Neural Network Training in NVM-Based Platforms
Published in: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, Vol. 37, No. 11, pp. 2709-2719
Main Authors: , ,
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2018
Summary: Neural networks (NNs) have become the go-to tool for solving many real-world recognition and classification tasks with massive and complex data sets. These networks require large data sets for training, which is usually performed on GPUs and CPUs in either a cloud or edge computing setting. No matter where the training is performed, it is subject to tight power/energy and data storage/transfer constraints. While these issues can be mitigated by replacing SRAM/DRAM with nonvolatile memories (NVMs), which offer near-zero leakage power and high scalability, the massive weight updates performed during training shorten NVM endurance and engender high write energy. In this paper, an NVM-friendly NN training approach is proposed. Weight update is redesigned to reduce bit flips in NVM cells. Moreover, two techniques, namely, filter exchange and bitwise rotation, are proposed to respectively balance writes to different weights and to different bits of one weight. The proposed techniques are integrated and evaluated in Caffe. Experimental results show significant power savings and endurance improvements, while maintaining high inference accuracy.
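The abstract names three mechanisms: a redesigned weight update that reduces bit flips per write, filter exchange to balance writes across weights, and bitwise rotation to balance writes across the bits of one weight. As a rough illustration of the general idea behind flip-reduced (data-comparison) writes combined with bitwise rotation, here is a minimal Python sketch over a simulated NVM word array. Everything in it (the NvmWeightArray class, the 16-bit fixed-point word width, the ROTATE_PERIOD) is an assumption made for this example; the paper's actual method is integrated into Caffe and also includes filter exchange, which is not shown.

```python
"""Minimal sketch: data-comparison writes + bitwise rotation wear leveling.

Illustrative only; not the paper's Caffe implementation. All names and
constants here (NvmWeightArray, WORD_BITS, ROTATE_PERIOD) are assumptions.
"""
import numpy as np

WORD_BITS = 16        # assume weights stored as 16-bit fixed-point words
ROTATE_PERIOD = 1024  # hypothetical: advance rotation every 1024 writes


def rotate_left(word: int, k: int) -> int:
    """Rotate a WORD_BITS-wide word left by k bit positions."""
    k %= WORD_BITS
    mask = (1 << WORD_BITS) - 1
    return ((word << k) | (word >> (WORD_BITS - k))) & mask


class NvmWeightArray:
    """Weights in simulated NVM cells with per-word rotation offsets.

    Small gradient updates toggle a weight's low-order bits on almost
    every step; rotating the stored pattern over time spreads that
    traffic across all bit positions so no single cell wears out first.
    """

    def __init__(self, n: int):
        self.cells = np.zeros(n, dtype=np.uint16)              # stored (rotated) patterns
        self.rot = np.zeros(n, dtype=np.int64)                 # rotation offset per word
        self.writes = np.zeros(n, dtype=np.int64)              # writes seen per word
        self.flips = np.zeros((n, WORD_BITS), dtype=np.int64)  # per-bit wear counters

    def read(self, i: int) -> int:
        # Undo the rotation to recover the logical weight value.
        return rotate_left(int(self.cells[i]), WORD_BITS - int(self.rot[i]))

    def write(self, i: int, value: int) -> None:
        self.writes[i] += 1
        rot = int(self.rot[i])
        if self.writes[i] % ROTATE_PERIOD == 0:
            rot = (rot + 1) % WORD_BITS  # periodically migrate hot bit positions
        new_word = rotate_left(value & 0xFFFF, rot)
        old_word = int(self.cells[i])
        if new_word == old_word:
            self.rot[i] = rot
            return  # data-comparison write: identical pattern, no cell toggles
        diff = old_word ^ new_word
        for b in range(WORD_BITS):
            if diff & (1 << b):
                self.flips[i, b] += 1  # only the differing cells physically flip
        self.cells[i] = new_word
        self.rot[i] = rot
```

Driving one weight with many tiny updates shows the intended effect: without rotation, bit 0 would absorb nearly every flip, while with rotation the per-bit counters grow more evenly.

```python
arr = NvmWeightArray(1)
w = 0x0100
for _ in range(10_000):
    w = (w + 1) & 0xFFFF  # stand-in for a small weight update
    arr.write(0, w)
assert arr.read(0) == w   # logical value is unaffected by rotation
print(arr.flips[0])       # per-bit flip counts, spread out by rotation
```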
ISSN: 0278-0070 (print); 1937-4151 (electronic)
DOI: 10.1109/TCAD.2018.2858360