Using Floating Gate Memory to Train Ideal Accuracy Neural Networks
- Author
Agarwal, Sapan; Garland, Diana; Niroula, John; Jacobs-Gedrim, Robin B.; Hsia, Alex; Van Heukelom, Michael S.; Fuller, Elliot; Draper, Bruce; and Marinella, Matthew J.
- Abstract
Floating gate SONOS (Silicon-Oxide-Nitride-Oxide-Silicon) transistors can be used to train neural networks to ideal accuracies that match those of floating-point digital weights on the MNIST dataset when multiple devices represent a weight, or to within 1% of ideal accuracy when a single device is used. This is enabled by operating the devices in the subthreshold regime, where they exhibit symmetric write nonlinearities. A neural training accelerator core based on SONOS with a single device per weight would be 120X more energy efficient, operate 2.1X faster, and require 5X less area than an optimized SRAM-based ASIC.
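The key mechanism named in the abstract, a symmetric write nonlinearity, can be illustrated with a toy model. The sketch below is not the paper's device physics: the `write_pulse` function, the `ALPHA` constant, and the multiplicative update rule are assumptions chosen only to show how paired up/down pulses can cancel exactly when the update step scales with the conductance itself, and how two devices can represent one signed weight.

```python
import numpy as np

# Assumed toy model (not the paper's equations): in subthreshold, a fixed
# programming pulse changes conductance multiplicatively, so equal-magnitude
# potentiation and depression pulses cancel despite the nonlinearity.

G_MIN, G_MAX = 1e-3, 1.0   # normalized conductance range (assumed)
ALPHA = 0.05               # per-pulse update scale (assumed)

def write_pulse(g, delta):
    """Apply a signed update `delta` to conductance `g`.

    The step is proportional to g itself (dG ~ ALPHA * delta * g), so a
    +delta pulse followed by a -delta pulse multiplies g by
    exp(+a) * exp(-a) = 1, returning it to its starting value.
    """
    return np.clip(g * np.exp(ALPHA * delta), G_MIN, G_MAX)

def weight(g_pos, g_neg):
    """Represent one signed weight with two devices, w = G+ - G-."""
    return g_pos - g_neg

# Example: a +1 pulse followed by a -1 pulse leaves the device unchanged.
g = 0.3
g = write_pulse(g, +1.0)
g = write_pulse(g, -1.0)
print(abs(g - 0.3) < 1e-12)   # True: the symmetric updates cancel
```

With an asymmetric nonlinearity, the same +1/-1 pulse pair would drift the conductance toward one end of its range, accumulating error over many training updates; the cancellation above is what lets training reach accuracies comparable to floating-point weights.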
- Published
- 2019