Using Floating-Gate Memory to Train Ideal Accuracy Neural Networks

Bibliographic Details
Published in: IEEE Journal on Exploratory Solid-State Computational Devices and Circuits, Vol. 5, No. 1, pp. 52-57
Main Authors: Agarwal, Sapan; Garland, Diana; Niroula, John; Jacobs-Gedrim, Robin B.; Hsia, Alex; Van Heukelom, Michael S.; Fuller, Elliot; Draper, Bruce; Marinella, Matthew J.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.06.2019

Summary: Floating-gate silicon-oxide-nitride-oxide-silicon (SONOS) transistors can be used to train neural networks to ideal accuracies that match those of floating-point digital weights on the MNIST handwritten digit data set when using multiple devices to represent a weight, or to within 1% of ideal accuracy when using a single device. This is enabled by operating the devices in the subthreshold regime, where they exhibit symmetric write nonlinearities. A neural training accelerator core based on SONOS with a single device per weight would increase energy efficiency by 120×, operate 2.1× faster, and require 5× lower area than an optimized SRAM-based ASIC.
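
As a rough illustration of two terms in the summary (a weight represented by more than one device, and bounded, nonlinear conductance writes), the Python sketch below maps a signed weight update onto a differential pair of devices using a generic soft-bound update rule. The update model, the normalized conductance range, and the step-size constant are assumptions chosen for readability; they are not the measured SONOS characteristics or the training scheme reported in the article.

    # Minimal sketch, not the paper's device model: a weight stored as the
    # difference of two conductances, g_plus - g_minus, with saturating
    # ("soft-bound") write updates. All parameters are illustrative.
    G_MIN, G_MAX = 0.0, 1.0   # assumed normalized conductance range
    ALPHA = 0.05              # assumed per-pulse step-size factor

    def potentiate(g, n_pulses=1):
        # Raise conductance; the step shrinks as g nears G_MAX (write nonlinearity).
        for _ in range(n_pulses):
            g = g + ALPHA * (G_MAX - g)
        return g

    def apply_update(g_plus, g_minus, delta_w):
        # Map a signed weight update onto the differential pair:
        # positive updates potentiate g_plus, negative updates potentiate g_minus.
        pulses = max(1, int(round(abs(delta_w) / ALPHA)))
        if delta_w >= 0:
            g_plus = potentiate(g_plus, pulses)
        else:
            g_minus = potentiate(g_minus, pulses)
        return g_plus, g_minus

    if __name__ == "__main__":
        gp, gm = 0.5, 0.5                      # start both devices at mid-range
        gp, gm = apply_update(gp, gm, +0.10)   # push the weight up
        gp, gm = apply_update(gp, gm, -0.10)   # push it back down
        print("effective weight:", gp - gm)

The printed weight returns to zero here because both devices start at the same conductance and receive identically shaped pulses; the sketch is only meant to make the terminology concrete.
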
Bibliography: AC04-94AL85000; NA0003525
USDOE National Nuclear Security Administration (NNSA)
SAND-2019-0981J
ISSN: 2329-9231
DOI: 10.1109/JXCDC.2019.2902409