IEEE Access (Jan 2021)
Deep Learning Algorithms for the Work Function Fluctuation of Random Nanosized Metal Grains on Gate-All-Around Silicon Nanowire MOSFETs
Abstract
Device simulation has been explored and industrialized for over 40 years; however, it still incurs substantial computational cost. It can therefore be further advanced using deep learning (DL) algorithms. We report, for the first time, an efficient and accurate DL approach combined with device simulation for gate-all-around silicon nanowire metal-oxide-semiconductor field-effect transistors (MOSFETs) to predict the electrical characteristics of the device induced by work function fluctuation. Using three different DL models, an artificial neural network (ANN), a convolutional neural network (CNN), and a long short-term memory (LSTM) network, the variability of the threshold voltage, on-current, and off-current is predicted with respect to different metal-grain numbers and the locations of the low and high work function values. A comparison among the ANN, CNN, and LSTM models shows that the CNN model performs best in terms of the root mean squared error and the percentage error rate. The integration of device simulation with DL models enables efficient characteristic estimation of the explored device, and the accurate predictions from the DL models can accelerate the device simulation process. Notably, the DL approach is able to extract crucial electrical characteristics of a complicated device accurately, with an error of about 2%, in a computationally cost-effective manner.
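As an illustration of the two evaluation metrics named above, the following minimal sketch computes the root mean squared error and the percentage error rate (taken here as the mean absolute percentage error, a common definition; the exact formulation used in the paper may differ) between simulated and DL-predicted values. The threshold-voltage samples are hypothetical placeholders, not data from the paper.

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between simulated and DL-predicted values."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def percentage_error_rate(y_true, y_pred):
    """Mean absolute percentage error; one common reading of 'error rate'."""
    n = len(y_true)
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / n

# Placeholder threshold-voltage samples (V): TCAD-simulated vs. DL-predicted.
vth_sim  = [0.250, 0.262, 0.247, 0.255]
vth_pred = [0.252, 0.259, 0.249, 0.254]

print(f"RMSE: {rmse(vth_sim, vth_pred):.4f} V")
print(f"Percentage error rate: {percentage_error_rate(vth_sim, vth_pred):.2f} %")
```

A model whose percentage error rate on such held-out simulated characteristics stays near 2% would match the accuracy level reported in the abstract.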
Keywords