IEEE Open Journal of Nanotechnology (Jan 2023)

A Comprehensive Technique Based on Machine Learning for Device and Circuit Modeling of Gate-All-Around Nanosheet Transistors

  • Rajat Butola,
  • Yiming Li,
  • Sekhar Reddy Kola

DOI: https://doi.org/10.1109/OJNANO.2023.3328425
Journal volume & issue: Vol. 4, pp. 181–194

Abstract


Machine learning (ML) is poised to play an important role in advancing predictive capability in the semiconductor device compact-modeling domain. One major advantage of ML-based compact modeling is its ability to capture complex relationships and patterns in large datasets. Therefore, in this paper a novel design scheme based on a dynamically adaptive neural network (DANN) is proposed to develop a fast and accurate compact model (CM). This framework constitutes a powerful yet computationally efficient methodology and exhibits emergent dynamic behaviors. The paper demonstrates that an ML-based compact model can be designed to replicate the performance of conventional compact models for nanodevices. In this work, gate-all-around (GAA) nanosheet (NS) device characteristics are comprehensively analyzed for process-variability sources using the proposed model. Device geometry parameters such as channel length, nanosheet width, and nanosheet thickness are fed as input features to the DANN model. The adaptive neural network learns dynamically by updating the model weights in accordance with the input features and achieves accurate neural-weight convergence. The proposed model predicts the electrical characteristics of NS devices with an error rate of less than 1%. The model is also implemented and validated for simulations of digital circuit designs such as inverters and logic gates.
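To illustrate the general idea of an ML-based compact model described above, the following is a minimal sketch (not the authors' DANN, whose dynamic weight-update scheme is not detailed in the abstract): a simple feedforward surrogate that maps GAA nanosheet geometry and bias conditions to drain current. The class name, feature set, layer sizes, and training settings are illustrative assumptions, and the target would in practice come from TCAD-style device simulations.

# Minimal sketch of a neural-network surrogate compact model (assumptions noted above)
import torch
import torch.nn as nn

class SurrogateCompactModel(nn.Module):
    def __init__(self, n_features=5, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),  # predicts log10(Id) for numerical stability (an assumption)
        )

    def forward(self, x):
        return self.net(x)

# Inputs (normalized): [channel length, nanosheet width, nanosheet thickness, Vgs, Vds]
# Target: log10 of drain current obtained from device simulations.
model = SurrogateCompactModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(x_batch, y_batch):
    opt.zero_grad()
    loss = loss_fn(model(x_batch), y_batch)
    loss.backward()
    opt.step()
    return loss.item()

# Example usage with random placeholder data (real data would come from TCAD sweeps)
x = torch.rand(32, 5)
y = torch.rand(32, 1)
print(train_step(x, y))

A trained surrogate of this kind can be evaluated orders of magnitude faster than a full device simulation, which is what makes it usable inside circuit-level simulations such as the inverter and logic-gate designs mentioned in the abstract.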

Keywords