IEEE Open Journal of the Industrial Electronics Society (Jan 2024)
Explainable Model Prediction of Memristor
Abstract
System-level simulation of neuro-memristive circuits under variability is complex and typically follows a black-box neural network approach. In realistic hardware, such simulations are often difficult to cross-check for accuracy and reproducibility. Accurate memristor model prediction therefore becomes critical for deciphering the overall circuit function across a wide range of nonideal, practical conditions. In most neuro-memristive systems, the crossbar configuration is essential for implementing multiply-and-accumulate operations, which form the primary unit of neural network implementations. Predicting the specific memristor model that best fits the crossbar simulations, and thereby making them explainable, is an open challenge solved in this article. As the size of the crossbar increases, cross-validation becomes even more challenging. This article proposes predicting the memristor device under test by automatically evaluating its I–V behavior using random forest and extreme gradient boosting algorithms. Starting with a single memristor model, the prediction approach is extended to make memristor crossbar-based circuits explainable. The performance of both algorithms is analyzed in terms of precision, recall, f1-score, and support; their accuracy, macro average, and weighted average are explored at different operational frequencies.
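The classification pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the two device models, their I–V equations, and the sweep parameters are hypothetical stand-ins, and only the random forest branch is shown (an `xgboost.XGBClassifier` could be substituted the same way for the extreme gradient boosting results).

```python
# Hedged sketch: predict which memristor model produced a given I-V sweep.
# The device equations below are illustrative placeholders, NOT the
# article's memristor models.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
v = np.linspace(-1.0, 1.0, 64)  # voltage sweep points (V)

def iv_curve(model, noise=0.02):
    """Synthetic I-V trace for a hypothetical device model (label 0 or 1)."""
    if model == 0:                                   # near-linear conductor
        i = 1e-3 * v
    else:                                            # nonlinear, hysteresis-like response
        i = 1e-3 * v**3 + 5e-4 * np.sin(np.pi * v)
    return i + noise * 1e-3 * rng.standard_normal(v.size)

# Build a labeled dataset: raw current samples serve as feature vectors.
X = np.array([iv_curve(m % 2) for m in range(400)])
y = np.array([m % 2 for m in range(400)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Per-class precision, recall, f1-score, and support, plus accuracy and
# macro/weighted averages -- the metrics analyzed in the article.
print(classification_report(y_te, clf.predict(X_te)))
```

In practice the features would come from measured or simulated I–V sweeps of the device (or crossbar) under test at each operational frequency, rather than from closed-form curves as here.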
Keywords