IEEE Journal of the Electron Devices Society (Jan 2024)

Large-Scale Training in Neural Compact Models for Accurate and Adaptable MOSFET Simulation

  • Chanwoo Park,
  • Seungjun Lee,
  • Junghwan Park,
  • Kyungjin Rim,
  • Jihun Park,
  • Seonggook Cho,
  • Jongwook Jeon,
  • Hyunbo Cho

DOI: https://doi.org/10.1109/JEDS.2024.3417521
Journal volume & issue: Vol. 12, pp. 745–751

Abstract

We address the challenges that traditional analytical models, such as BSIM, face in semiconductor device modeling: they often struggle to represent the complex behavior of miniaturized devices accurately. Neural Compact Models (NCMs) offer an alternative with greater modeling capability, but their effectiveness depends on extensive training data, and in real-world settings, where measurements available for device modeling are often limited, this dependence becomes a significant hindrance. In response, this work presents a large-scale pre-training approach for NCMs. By exploiting extensive datasets spanning multiple technology nodes, our method enables NCMs to develop a more detailed understanding of device behavior, improving the accuracy and adaptability of MOSFET simulation, particularly when data are scarce. Our study demonstrates the potential of large-scale pre-training to enhance NCMs, offering a practical solution to one of the key challenges in current device-modeling practice.
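
As a concrete illustration of the two-stage workflow the abstract describes, the sketch below pre-trains a small multilayer perceptron on a large pooled dataset and then fine-tunes it on a handful of points. Everything in it is an assumption made for illustration: the architecture, the feature set (V_GS, V_DS, W, L, node index), the synthetic placeholder data, and the hyperparameters are not taken from the paper.

```python
# Minimal sketch of pre-training a neural compact model on a large pooled
# dataset, then fine-tuning on limited measurements. All names, shapes, data,
# and hyperparameters are illustrative assumptions, not the authors' method.
import torch
import torch.nn as nn

class NeuralCompactModel(nn.Module):
    """Hypothetical MLP surrogate: bias/geometry features -> log10 drain current."""
    def __init__(self, n_features: int = 5, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def fit(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
        epochs: int, lr: float) -> float:
    """Plain full-batch Adam training loop; returns the final MSE."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    loss = torch.tensor(0.0)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

torch.manual_seed(0)

# Stage 1: large-scale pre-training on data pooled across technology nodes.
# The tensors below are synthetic placeholders for such a dataset.
x_pool = torch.rand(50_000, 5)        # normalized (Vgs, Vds, W, L, node index)
y_pool = x_pool[:, :1] * 3.0 - 6.0    # stand-in target for log10(I_d)
ncm = NeuralCompactModel()
fit(ncm, x_pool, y_pool, epochs=200, lr=1e-3)

# Stage 2: adaptation with a small measurement budget for the target device.
x_meas = torch.rand(50, 5)
y_meas = x_meas[:, :1] * 3.0 - 6.0
final_mse = fit(ncm, x_meas, y_meas, epochs=100, lr=1e-4)
print(f"fine-tuned MSE on limited data: {final_mse:.3e}")
```

In practice the pre-training pool would hold measured or TCAD-simulated I-V data across technology nodes, and the fine-tuning set the limited measurements of the target device; the lower fine-tuning learning rate used here is one common way to retain pre-trained behavior while adapting.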

Keywords