IEEE Journal on Exploratory Solid-State Computational Devices and Circuits (Jan 2019)

Using Floating-Gate Memory to Train Ideal Accuracy Neural Networks

  • Sapan Agarwal,
  • Diana Garland,
  • John Niroula,
  • Robin B. Jacobs-Gedrim,
  • Alex Hsia,
  • Michael S. Van Heukelom,
  • Elliot Fuller,
  • Bruce Draper,
  • Matthew J. Marinella

DOI
https://doi.org/10.1109/JXCDC.2019.2902409
Journal volume & issue
Vol. 5, no. 1
pp. 52 – 57

Abstract

Floating-gate silicon-oxide-nitride-oxide-silicon (SONOS) transistors can be used to train neural networks to ideal accuracies that match those of floating-point digital weights on the MNIST handwritten digit data set when using multiple devices to represent a weight, or to within 1% of ideal accuracy when using a single device. This is enabled by operating the devices in the subthreshold regime, where they exhibit symmetric write nonlinearities. A neural training accelerator core based on SONOS with a single device per weight would increase energy efficiency by 120×, operate 2.1× faster, and require 5× less area than an optimized SRAM-based ASIC.
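The abstract attributes the ideal training accuracy to the symmetric write nonlinearity that SONOS devices exhibit in the subthreshold regime. The sketch below (Python/NumPy) is an illustrative toy model, not the paper's simulator: the device update rule, the normalized conductance window G_MIN..G_MAX, and the nonlinearity strength NONLIN are assumptions chosen only to show the effect. It applies balanced potentiation/depression pulse pairs to a single stored weight and compares an asymmetric nonlinearity, which drags the weight toward mid-range even though the requested net change is zero, with a symmetric nonlinearity, under which the pulses largely cancel and the stored value is preserved.

```python
# Illustrative sketch, not the authors' code: the device model, normalized
# conductance window, and nonlinearity strength are assumptions for clarity.
import numpy as np

G_MIN, G_MAX = 0.0, 1.0   # assumed normalized conductance window
NONLIN = 0.5              # assumed strength of the write nonlinearity

def asymmetric_step(g, delta):
    """Asymmetric device: potentiation steps shrink near G_MAX while
    depression steps shrink near G_MIN, so up and down pulses differ."""
    if delta >= 0:
        scale = 1 - NONLIN * (g - G_MIN) / (G_MAX - G_MIN)
    else:
        scale = 1 - NONLIN * (G_MAX - g) / (G_MAX - G_MIN)
    return float(np.clip(g + delta * scale, G_MIN, G_MAX))

def symmetric_step(g, delta):
    """Symmetric device: the step size depends on the stored state in the
    same way for potentiation and depression (the subthreshold SONOS case)."""
    scale = 1 - NONLIN * (g - G_MIN) / (G_MAX - G_MIN)
    return float(np.clip(g + delta * scale, G_MIN, G_MAX))

def drift(step_fn, g0=0.8, n_pairs=1000, mag=0.01, seed=0):
    """Apply balanced +d / -d pulse pairs (zero net requested change) and
    report how far the stored weight moves from its starting value."""
    rng = np.random.default_rng(seed)
    g = g0
    for _ in range(n_pairs):
        d = mag * rng.random()
        g = step_fn(g, +d)   # potentiation pulse
        g = step_fn(g, -d)   # depression pulse of equal requested size
    return g - g0

print("asymmetric nonlinearity drift:", drift(asymmetric_step))  # pulled toward mid-range
print("symmetric  nonlinearity drift:", drift(symmetric_step))   # stays near g0
```

In the same spirit, the multi-device configuration mentioned in the abstract (several devices of different significance representing one weight) relaxes how precise each individual analog update must be, which is why it reaches the floating-point baseline while the single-device case lands within 1% of it.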

Keywords