Advanced Intelligent Systems (Jun 2021)

Gradient Descent on Multilevel Spin–Orbit Synapses with Tunable Variations

  • Xiukai Lan,
  • Yi Cao,
  • Xiangyu Liu,
  • Kaijia Xu,
  • Chuan Liu,
  • Houzhi Zheng,
  • Kaiyou Wang

DOI
https://doi.org/10.1002/aisy.202000182
Journal volume & issue
Vol. 3, no. 6

Abstract

Neuromorphic computing using multilevel nonvolatile memories as synapses offers opportunities for future energy‐ and area‐efficient artificial intelligence. Among these memories, artificial synapses based on current‐induced magnetization switching driven by spin–orbit torques (SOTs) have recently attracted great attention. Herein, the implementation of the gradient descent algorithm, a primary learning algorithm, on a 2 × 1 SOT synaptic array is reported. Successful pattern classifications are experimentally realized by tuning the cycle‐to‐cycle variation, linearity range, and linearity deviation of the multilevel SOT synapse. A larger m × n SOT synaptic array with m controlling transistors is also proposed, and the classification accuracy is found to improve dramatically as the cycle‐to‐cycle variation decreases. This work paves a way for applying spin–orbit device arrays in neuromorphic computing and highlights the crucial importance of cycle‐to‐cycle variation for multilevel SOT synapses.
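The abstract's central claim — that the cycle‐to‐cycle variation and linearity range of a multilevel synapse govern classification accuracy under gradient descent — can be illustrated with a toy simulation. The model below is a minimal sketch: the function names, the AND‐pattern task, and all parameter values (`sigma`, the weight range, the learning rate) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sot_write(w, dw, sigma=0.05, w_min=-1.0, w_max=1.0):
    """Apply one programming increment to a multilevel SOT-like synapse.

    `sigma` models the cycle-to-cycle variation of each written increment,
    and the weight saturates at the edges of its linearity range
    [w_min, w_max]. All parameter values here are illustrative.
    """
    noisy_dw = dw * (1.0 + sigma * rng.standard_normal(np.shape(dw)))
    return np.clip(w + noisy_dw, w_min, w_max)

def train(sigma, epochs=500, lr=0.5):
    """Gradient descent on a toy AND-pattern classification task,
    routing every weight update through the noisy synapse model."""
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([0., 0., 0., 1.])
    w, b = np.zeros(2), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))        # logistic output
        grad_w = X.T @ (p - y) / len(y)               # cross-entropy gradient
        w = sot_write(w, -lr * grad_w,
                      sigma=sigma, w_min=-3.0, w_max=3.0)
        b -= lr * np.mean(p - y)                      # bias kept ideal here
    return float(np.mean(((X @ w + b) > 0) == (y > 0.5)))

acc_ideal = train(sigma=0.0)   # noise-free multilevel synapse
acc_noisy = train(sigma=1.0)   # large cycle-to-cycle variation
```

Comparing `acc_ideal` and `acc_noisy` over repeated runs shows the trend the paper reports experimentally: larger cycle‐to‐cycle variation scatters the programmed weight increments and degrades learning.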

Keywords