Frontiers in Nanotechnology (Oct 2022)

Bayesian neural networks using magnetic tunnel junction-based probabilistic in-memory computing

  • Samuel Liu,
  • T. Patrick Xiao,
  • Jaesuk Kwon,
  • Bert J. Debusschere,
  • Sapan Agarwal,
  • Jean Anne C. Incorvia,
  • Christopher H. Bennett

DOI
https://doi.org/10.3389/fnano.2022.1021943
Journal volume & issue
Vol. 4

Abstract

Bayesian neural networks (BNNs) combine the generalizability of deep neural networks (DNNs) with a rigorous quantification of predictive uncertainty, which mitigates overfitting and makes them valuable for high-reliability or safety-critical applications. However, the probabilistic nature of BNNs makes them more computationally intensive on digital hardware and, so far, less directly amenable to acceleration by analog in-memory computing than DNNs. This work exploits a novel spintronic bit cell that efficiently and compactly implements Gaussian-distributed BNN values. Specifically, the bit cell combines a tunable stochastic magnetic tunnel junction (MTJ) encoding the trained standard deviation with a multi-bit domain-wall MTJ device independently encoding the trained mean. The two devices can be integrated within the same array, enabling highly efficient, fully analog, probabilistic matrix-vector multiplications. We use micromagnetic simulations as the basis of a system-level model of the spintronic BNN accelerator, demonstrating that our design yields accurate, well-calibrated uncertainty estimates for both classification and regression problems and matches software BNN performance. This result paves the way for spintronic in-memory computing systems that implement trusted neural networks at a modest energy budget.
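To make the probabilistic matrix-vector multiplication described in the abstract concrete, the following is a minimal NumPy sketch, not the authors' implementation or simulation code: each weight is sampled as W_ij ~ N(mu_ij, sigma_ij), where `mu` plays the role of the means stored in the domain-wall MTJs and `sigma` the standard deviations supplied by the tunable stochastic MTJs. All array sizes, values, and function names here are hypothetical.

```python
import numpy as np

# Illustrative sketch only: in the proposed hardware, mu is encoded by a
# multi-bit domain-wall MTJ and the sigma-scaled noise is generated by a
# tunable stochastic MTJ in the same array. Here both are NumPy arrays.

rng = np.random.default_rng(0)

def probabilistic_mvm(mu, sigma, x, rng):
    """One stochastic matrix-vector product: y = (mu + sigma * noise) @ x."""
    w_sample = mu + sigma * rng.standard_normal(mu.shape)
    return w_sample @ x

# Hypothetical layer sizes: 4 outputs, 3 inputs.
mu = rng.normal(0.0, 0.5, size=(4, 3))          # trained weight means
sigma = np.abs(rng.normal(0.1, 0.02, (4, 3)))   # trained std devs (>= 0)
x = rng.normal(size=3)                          # input activations

# A BNN prediction averages over repeated stochastic passes; the spread
# across samples provides the predictive uncertainty estimate.
samples = np.stack([probabilistic_mvm(mu, sigma, x, rng) for _ in range(100)])
print("mean output:", samples.mean(axis=0))
print("std  output:", samples.std(axis=0))
```

In hardware, each stochastic pass corresponds to one analog read of the array, so the sampling loop above maps onto repeated in-memory multiply operations rather than explicit weight draws in software.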
