Advanced Intelligent Systems (Sep 2021)

Hardware‐Friendly Stochastic and Adaptive Learning in Memristor Convolutional Neural Networks

  • Wei Zhang,
  • Lunshuai Pan,
  • Xuelong Yan,
  • Guangchao Zhao,
  • Hong Chen,
  • Xingli Wang,
  • Beng Kang Tay,
  • Gaokuo Zhong,
  • Jiangyu Li,
  • Mingqiang Huang

DOI
https://doi.org/10.1002/aisy.202100041
Journal volume & issue
Vol. 3, no. 9
pp. n/a – n/a

Abstract


Memristors offer great advantages as a new hardware solution for neuromorphic computing due to their fast and energy-efficient matrix-vector multiplication. However, the nonlinear weight-updating property of memristors makes them difficult to train in a neural network learning process. Several compensation schemes have been proposed to mitigate the updating error caused by nonlinearity; nevertheless, they usually involve complex peripheral circuit design. Herein, stochastic and adaptive learning methods for weight updating are developed, in which the inaccuracy caused by the memristor nonlinearity can be effectively suppressed. In addition, compared with the traditional nonlinear stochastic gradient descent (SGD) updating algorithm or the piecewise linear (PL) method, which are most often used in memristor neural networks, the design is more hardware friendly and energy efficient because pulse number, duration, and direction need not be considered. The effectiveness of the proposed method is investigated on the training of a LeNet-5 convolutional neural network. A high accuracy of about 93.88% is achieved on the Modified National Institute of Standards and Technology (MNIST) handwritten digits dataset (with a typical memristor nonlinearity of ±1), which is close to that of the network trained with the complex PL method (94.7%) and higher than that of the original nonlinear SGD method (90.14%).
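
The abstract does not give the exact update rule, but the general idea can be illustrated with a minimal sketch: a commonly used exponential model of state-dependent (nonlinear) memristor conductance updates, combined with a sign-based stochastic update in which each weight receives at most one fixed pulse per step, fired with probability proportional to the gradient magnitude. All names, the exponential nonlinearity model, and the parameter values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only (not the authors' exact scheme): an exponential
# model of nonlinear memristor conductance updates, plus a sign-based
# stochastic weight update in which each device receives at most one fixed
# pulse per step, fired with probability proportional to |grad|.
# NL, W_MIN, W_MAX, and the step sizes are assumed values.

NL = 1.0               # nonlinearity factor (the abstract cites +/-1 as typical)
W_MIN, W_MAX = 0.0, 1.0

def nonlinear_step(w, direction, nl=NL, step=0.02):
    """Apply one potentiation (direction > 0) or depression pulse.
    The conductance change shrinks as the device approaches its bound,
    modeling the nonlinearity that distorts plain SGD updates."""
    pot = step * np.exp(-nl * (w - W_MIN) / (W_MAX - W_MIN))
    dep = -step * np.exp(-nl * (W_MAX - w) / (W_MAX - W_MIN))
    return np.clip(w + np.where(direction > 0, pot, dep), W_MIN, W_MAX)

def stochastic_update(w, grad, lr=0.5, rng=None):
    """Sign-based stochastic update: every weight either gets one fixed
    pulse (with probability ~ lr * |grad|) or is left untouched, so pulse
    number, duration, and direction never have to be tuned per weight."""
    rng = np.random.default_rng() if rng is None else rng
    p = np.clip(lr * np.abs(grad), 0.0, 1.0)      # per-weight firing probability
    fire = rng.random(w.shape) < p                # which devices receive a pulse
    stepped = nonlinear_step(w, -np.sign(grad))   # pulse direction opposes gradient
    return np.where(fire, stepped, w)

# Toy usage: one stochastic update of a 4x4 weight (conductance) tile.
w = np.full((4, 4), 0.5)
g = np.random.default_rng(1).normal(scale=0.5, size=(4, 4))
w_next = stochastic_update(w, g, rng=np.random.default_rng(2))
```

In expectation, the firing probability scales the average update with the gradient magnitude while every individual pulse is identical, which is what makes such a scheme attractive for hardware: no per-weight control of pulse number or width is required.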

Keywords