IEEE Access (Jan 2024)

A Quantized-Weight-Splitting Method of RRAM Arrays for Neuromorphic Applications

  • Kyungchul Park,
  • Sungjoon Kim,
  • Jong-Hyuk Park,
  • Woo Young Choi

DOI
https://doi.org/10.1109/ACCESS.2024.3394253
Journal volume & issue
Vol. 12
pp. 59680 – 59687

Abstract


In asynchronous Spiking Neural Networks (SNNs), the voltage division between passive resistive random-access memory (RRAM) arrays and neuron circuits presents a significant challenge, degrading network accuracy and power efficiency. This study introduces the quantized-weight-splitting method (QWSM) as a novel solution to this challenge. The QWSM optimizes and splits the quantized weights by accounting for the static read distortion caused by voltage division, thereby preserving inference accuracy while reducing power consumption. To validate the QWSM, fabricated RRAM devices were measured, and a fitting model was carefully developed to describe their behavior, showing strong agreement with the measured data. In SNN simulations using this fitting model, applying the QWSM improved inference accuracy across various weight quantization levels. Moreover, the QWSM substantially reduced average power consumption: compared to a network configured with the smallest combined RRAM conductance for low-power operation, the network applying the QWSM achieved a 12.56% reduction in average power per synapse. This power saving, combined with the improved accuracy, positions the QWSM as a valuable tool for efficient SNN design using passive RRAM arrays. Our findings highlight the potential of the QWSM to advance neuromorphic computing with better energy efficiency and accuracy robustness.
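The static read distortion the abstract attributes to voltage division can be illustrated with a toy model: an RRAM synapse read through a neuron circuit with finite input resistance forms a resistive divider, so the delivered current falls below the ideal conductance-proportional value, and the shortfall grows with device conductance. All values here (the neuron input resistance, read voltage, and device conductances) are illustrative assumptions, not parameters from the paper:

```python
# Toy model of static read distortion from voltage division in a passive
# RRAM array. Assumed values only; not the paper's device parameters.

def ideal_current(g_rram: float, v_read: float = 0.2) -> float:
    """Ideal read current if the full read voltage dropped across the RRAM."""
    return v_read * g_rram  # current proportional to the stored weight

def effective_current(g_rram: float, r_neuron: float = 10e3,
                      v_read: float = 0.2) -> float:
    """Actual read current with a neuron input resistance in series:
    the read voltage divides between the RRAM and the neuron circuit."""
    r_rram = 1.0 / g_rram
    return v_read / (r_rram + r_neuron)

def relative_distortion(g_rram: float) -> float:
    """Fractional current loss caused by the voltage divider."""
    return 1.0 - effective_current(g_rram) / ideal_current(g_rram)

if __name__ == "__main__":
    for g in (10e-6, 50e-6, 100e-6):  # 10, 50, 100 microsiemens
        print(f"g = {g*1e6:5.1f} uS  distortion = {relative_distortion(g):.1%}")
```

The sketch shows why the distortion is weight-dependent: higher-conductance (larger-weight) devices lose a larger fraction of their intended current, which is the nonlinearity a weight-splitting scheme such as the QWSM must compensate for.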
