AIP Advances (Feb 2022)
Comparison of update and genetic training algorithms in a memristor crossbar perceptron
Abstract
Memristor-based computer architectures are becoming more attractive as a possible choice of hardware for the implementation of neural networks. However, at present, memristor technologies are susceptible to a variety of failure modes, a serious concern in any application where regular access to the hardware may not be expected or even possible. In this study, we investigate whether certain training algorithms may be more resilient to particular hardware failure modes and, therefore, more suitable for use in those applications. We implement two training algorithms—a local update scheme and a genetic algorithm—in a simulated memristor crossbar and compare their ability to train for a simple image classification task as an increasing number of memristors fail to adjust their conductance. We demonstrate a clear distinction between the two algorithms across several measures of the rate at which training fails.
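The comparison described above can be sketched in a toy simulation. The task, failure model, hyperparameters, and all function names below are illustrative assumptions, not the paper's actual setup: stuck devices are modeled as weights frozen at a random conductance, a perceptron-style local update is compared against a simple elitist genetic algorithm, and both are evaluated on the faulty "hardware".

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable task (a stand-in for the paper's image classification)
n_in, n_out, n_samples = 16, 2, 300
X = rng.normal(size=(n_samples, n_in))
teacher = rng.normal(size=(n_in, n_out))
y = np.argmax(X @ teacher, axis=1)

def accuracy(W, stuck, frozen):
    """Evaluate weights as realised on hardware: stuck memristors
    always present their frozen conductance, whatever W says."""
    W_eff = np.where(stuck, frozen, W)
    return float(np.mean(np.argmax(X @ W_eff, axis=1) == y))

def make_stuck(frac):
    stuck = rng.random((n_in, n_out)) < frac       # devices that cannot update
    frozen = rng.normal(size=(n_in, n_out)) * 0.1  # their fixed conductances
    return stuck, frozen

def train_local(stuck, frozen, epochs=30, lr=0.05):
    """Perceptron-style local update; writes to stuck devices are ignored."""
    W = rng.normal(size=(n_in, n_out)) * 0.1
    for _ in range(epochs):
        for x, t in zip(X, y):
            pred = np.argmax(x @ np.where(stuck, frozen, W))
            if pred != t:
                W[:, t] += np.where(stuck[:, t], 0.0, lr * x)
                W[:, pred] -= np.where(stuck[:, pred], 0.0, lr * x)
    return accuracy(W, stuck, frozen)

def train_genetic(stuck, frozen, pop=40, gens=60):
    """Elitist GA: fitness is always measured through the faulty crossbar,
    so the search can route around stuck devices."""
    pop_W = rng.normal(size=(pop, n_in, n_out)) * 0.1
    for _ in range(gens):
        fit = np.array([accuracy(W, stuck, frozen) for W in pop_W])
        elite = pop_W[np.argsort(fit)[-pop // 4:]]          # keep top quarter
        children = elite[rng.integers(0, len(elite), pop)]  # resample parents
        pop_W = children + rng.normal(size=children.shape) * 0.05  # mutate
    fit = np.array([accuracy(W, stuck, frozen) for W in pop_W])
    return float(fit.max())

stuck, frozen = make_stuck(0.3)  # 30% of memristors fail to adjust
acc_local = train_local(stuck, frozen)
acc_ga = train_genetic(stuck, frozen)
print("local update:", acc_local)
print("genetic:     ", acc_ga)
```

Sweeping the stuck fraction passed to `make_stuck` from 0 toward 1 and recording where each trainer drops to chance accuracy gives a crude analogue of the failure-rate comparison the abstract reports.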