Advanced Intelligent Systems (Jan 2024)

Binary‐Stochasticity‐Enabled Highly Efficient Neuromorphic Deep Learning Achieves Better‐than‐Software Accuracy

  • Yang Li,
  • Wei Wang,
  • Ming Wang,
  • Chunmeng Dou,
  • Zhengyu Ma,
  • Huihui Zhou,
  • Peng Zhang,
  • Nicola Lepri,
  • Xumeng Zhang,
  • Qing Luo,
  • Xiaoxin Xu,
  • Guanhua Yang,
  • Feng Zhang,
  • Ling Li,
  • Daniele Ielmini,
  • Ming Liu

DOI
https://doi.org/10.1002/aisy.202300399
Journal volume & issue
Vol. 6, no. 1

Abstract

In this work, the requirement of using high-precision (HP) signals is lifted and the circuits for implementing deep learning algorithms in memristor-based hardware are simplified. The backpropagation learning algorithm requires HP signals because its gradient-descent learning rule relies on the chain product of partial derivatives. However, implementing such an HP algorithm in noisy, analog memristor-based hardware systems is both challenging and biologically implausible. Herein, it is demonstrated that handling HP signals is not necessary and that more efficient deep learning can be achieved with a binary stochastic learning algorithm. The algorithm proposed in this work modifies the elementary operations of the neural network, improving energy efficiency by two orders of magnitude compared with traditional memristor-based hardware and by three orders of magnitude compared with complementary metal–oxide–semiconductor-based hardware. It also achieves better accuracy in pattern-recognition tasks than the HP learning-algorithm benchmarks.
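The abstract does not give the exact update rules, but the general idea of replacing HP signals with binary stochastic ones can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' implementation: the function names (stoch_binarize), layer sizes, and learning rate are illustrative choices. It shows how both forward activations and backpropagated errors can be sampled into binary (or signed binary) values, so that the matrix products and outer-product weight updates reduce to additions and subtractions rather than HP multiplications.

```python
# Minimal sketch (assumed, not the paper's code) of binary-stochastic signals
# replacing high-precision ones in a single forward/backward learning step.
import numpy as np

rng = np.random.default_rng(0)

def stoch_binarize(p):
    """Sample a binary signal: 1 with probability p (clipped to [0, 1]), else 0."""
    p = np.clip(p, 0.0, 1.0)
    return (rng.random(p.shape) < p).astype(np.float32)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One step of a toy two-layer network on a random batch (shapes are arbitrary).
x  = rng.random((4, 16)).astype(np.float32)                 # input batch
W1 = rng.normal(0, 0.1, (16, 32)).astype(np.float32)
W2 = rng.normal(0, 0.1, (32, 10)).astype(np.float32)
y  = np.eye(10, dtype=np.float32)[rng.integers(0, 10, 4)]   # one-hot targets

# Forward pass: hidden activations become binary stochastic "spikes", so the
# next matrix product amounts to selectively adding weight rows.
h_bin = stoch_binarize(sigmoid(x @ W1))
y_hat = sigmoid(h_bin @ W2)

# Backward pass: the error is also reduced to a signed binary stochastic signal,
# so the outer-product weight update needs only +1/-1 increments.
err     = y - y_hat                                         # values in [-1, 1]
err_bin = np.sign(err) * stoch_binarize(np.abs(err))
lr      = 0.1

W2 += lr * h_bin.T @ err_bin                                # binary x binary outer products
back_err = err_bin @ W2.T                                   # propagated error
back_bin = np.sign(back_err) * stoch_binarize(np.clip(np.abs(back_err), 0, 1))
W1 += lr * x.T @ (back_bin * h_bin)                         # update gated by binary activations
```

Because every signal crossing a layer boundary is binary, such a scheme maps naturally onto memristor crossbars, where updates can be applied as identical pulses rather than analog-precise increments; the specific pulse schemes and accuracy claims are those reported in the article, not demonstrated by this sketch.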

Keywords