Advanced Science (Jun 2022)

Nonideality‐Aware Training for Accurate and Robust Low‐Power Memristive Neural Networks

  • Dovydas Joksas,
  • Erwei Wang,
  • Nikolaos Barmpatsalos,
  • Wing H. Ng,
  • Anthony J. Kenyon,
  • George A. Constantinides,
  • Adnan Mehonic

DOI
https://doi.org/10.1002/advs.202105784
Journal volume & issue
Vol. 9, no. 17
pp. n/a – n/a

Abstract

Recent years have seen a rapid rise in artificial neural networks being employed in a number of cognitive tasks. The ever‐increasing computing requirements of these structures have contributed to a desire for novel technologies and paradigms, including memristor‐based hardware accelerators. Solutions based on memristive crossbars and analog data processing promise to improve the overall energy efficiency. However, memristor nonidealities can lead to the degradation of neural network accuracy, while attempts to mitigate these negative effects often introduce design trade‐offs, such as those between power and reliability. In this work, the authors design nonideality‐aware training of memristor‐based neural networks capable of dealing with the most common device nonidealities. The feasibility of using high‐resistance devices that exhibit high I‐V nonlinearity is demonstrated: by analyzing experimental data and employing nonideality‐aware training, it is estimated that the energy efficiency of memristive vector‐matrix multipliers is improved by almost three orders of magnitude (0.715 TOP s⁻¹ W⁻¹ to 381 TOP s⁻¹ W⁻¹) while maintaining similar accuracy. It is shown that associating the parameters of neural networks with individual memristors allows these devices to be biased toward less conductive states through regularization of the corresponding optimization problem, while modifying the validation procedure leads to more reliable estimates of performance. The authors demonstrate the universality and robustness of this approach when dealing with a wide range of nonidealities.
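The ideas in the abstract can be illustrated with a minimal numerical sketch. The code below is an illustrative assumption, not the paper's implementation: it encodes signed weights as a differential pair of conductances (a common crossbar scheme), simulates a vector-matrix multiply with multiplicative programming noise as a stand-in for device nonidealities, and adds an L1 penalty on total conductance that biases devices toward less conductive (higher-resistance) states. All function names, the noise model, and the penalty form are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def weights_to_conductances(W):
    # Differential-pair encoding: W = G_plus - G_minus, both nonnegative
    # (units absorbed into the weights; a simplifying assumption).
    return np.clip(W, 0.0, None), np.clip(-W, 0.0, None)

def noisy_vmm(x, W, noise_std=0.05):
    # Simulate a memristive vector-matrix multiply with multiplicative
    # noise on the conductances, standing in for device nonidealities
    # such as programming variability (assumed model, not the paper's).
    G_plus, G_minus = weights_to_conductances(W)
    G_plus_n = G_plus * (1.0 + noise_std * rng.standard_normal(G_plus.shape))
    G_minus_n = G_minus * (1.0 + noise_std * rng.standard_normal(G_minus.shape))
    return x @ (G_plus_n - G_minus_n)

def training_loss(x, W, y, reg=1e-3):
    # Task loss evaluated under the simulated nonideality, plus an L1
    # conductance penalty that regularizes devices toward less
    # conductive states, as described in the abstract.
    y_hat = noisy_vmm(x, W)
    G_plus, G_minus = weights_to_conductances(W)
    return np.mean((y_hat - y) ** 2) + reg * (G_plus.sum() + G_minus.sum())
```

Training against `training_loss` (e.g., with a gradient-free or autodiff optimizer) would then see the noise at every forward pass, which is the essence of nonideality-aware training: the network learns weights that remain accurate when realized on imperfect devices.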

Keywords