IEEE Access (Jan 2024)
Optimizing Memristor-Based Synaptic Devices for Enhanced Energy Efficiency and Accuracy in Neuromorphic Machine Learning
Abstract
The traditional von Neumann computing architecture, which requires data transfer between external memory and the processor, incurs significant energy and time costs when running deep learning (DL) and machine learning (ML) workloads; the frequent, intensive data movement between memory and processor is the principal source of this inefficiency. Memristive synaptic devices are therefore employed to overcome this energy and time inefficiency in cognitive tasks. The fundamental working principle of memristive devices is to combine memory and processing in the same location, reducing the need for data transfer and thereby significantly decreasing both energy consumption and operation time. However, for neuromorphic systems to deliver the desired energy and time efficiency, their accuracy and test error rates in classification applications must be further improved. Achieving high accuracy in such DL or ML models requires optimization not only at the hardware level but also at the algorithmic level. In this context, this paper presents a comprehensive examination and comparison of the frequently used stochastic gradient descent (SGD) algorithm and its momentum and adaptive variants for DL and ML applications in memristor-based neuromorphic computing systems. The study thoroughly investigates critical metrics such as the learning properties, energy efficiency, and accuracy of a nano-scale titanium dioxide ($TiO_{2}$) based synaptic device. On the MNIST dataset, the experiments yielded accuracies of 89.48% (AdaDelta), 79.00% (AdaGrad), 79.13% (Adam), 79.68% (AdaMax), 88.55% (Momentum), 81.20% (Nadam), 84.91% (RMSprop), and 89.47% (SGD). On the CIFAR dataset, the corresponding accuracies were 90.51% (AdaDelta), 82.08% (AdaGrad), 83.10% (Adam), 81.76% (AdaMax), 91.25% (Momentum), 82.45% (Nadam), 88.11% (RMSprop), and 90.21% (SGD).
Keywords