Machine Learning: Science and Technology (Jan 2024)

Generalization of graph-based active learning relaxation strategies across materials

  • Xiaoxiao Wang,
  • Joseph Musielewicz,
  • Richard Tran,
  • Sudheesh Kumar Ethirajan,
  • Xiaoyan Fu,
  • Hilda Mera,
  • John R Kitchin,
  • Rachel C Kurchin,
  • Zachary W Ulissi

DOI: https://doi.org/10.1088/2632-2153/ad37f0
Journal volume & issue: Vol. 5, No. 2, p. 025018

Abstract

Although density functional theory (DFT) has helped accelerate the discovery of new materials, such calculations are computationally expensive, especially for high-throughput efforts. This has prompted extensive exploration of machine learning (ML)-assisted techniques to improve the computational efficiency of DFT. In this study, we present a comprehensive investigation of the broader application of Finetuna, an active learning framework that accelerates structural relaxations in DFT using prior information from graph neural networks pretrained on Open Catalyst Project data. We explore the challenges associated with out-of-domain systems: larger adsorbates such as $C_{>2}$ alcohols on metal surfaces, metal oxides with spin polarization, and three-dimensional (3D) structures such as zeolites and metal-organic frameworks. By pretraining ML models on large datasets and fine-tuning the model on-the-fly during the simulation, we demonstrate the framework's ability to conduct relaxations with fewer DFT calculations. The less similar a test system is to the training systems, the more conservative a querying strategy is applied. Our best-performing Finetuna strategy reduces the number of DFT single-point calculations by 80% for alcohols and 3D structures, and by 42% for oxide systems.
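The fine-tune-and-query loop the abstract describes can be sketched in miniature. Everything below is a toy stand-in, not the paper's method: `dft_single_point` is a 1D anharmonic well rather than a real DFT call, and the "surrogate" is a linear force model refit after each query instead of a pretrained graph neural network. The control flow, however, mirrors the active learning idea: propose relaxation steps with the cheap model, and spend expensive single-point calls only to validate the proposal and refit the surrogate, with a trust radius standing in for a conservative querying strategy.

```python
import numpy as np

def dft_single_point(x):
    """Toy stand-in for an expensive DFT single-point call (assumed for
    illustration only): a slightly anharmonic 1D well with minimum at x = 2."""
    d = x - 2.0
    energy = d**2 + 0.1 * d**4
    force = -2.0 * d - 0.4 * d**3
    return energy, force

def relax_baseline(x, alpha=0.1, fmax=1e-3, max_steps=500):
    """Plain steepest descent: one 'DFT' call per relaxation step."""
    calls = 0
    for _ in range(max_steps):
        _, f = dft_single_point(x)
        calls += 1
        if abs(f) < fmax:               # converged: residual force below threshold
            break
        x += alpha * f
    return x, calls

def relax_active(x, fmax=1e-3, trust=2.0, max_steps=100):
    """Surrogate-accelerated relaxation: a linear force model (the crudest
    possible 'fine-tuned' surrogate) is refit after every query, and the
    expensive call is spent only to validate each surrogate-proposed step."""
    xs, fs, calls = [], [], 0
    for probe in (x, x + 0.1):          # two seed queries to fit the model
        _, f = dft_single_point(probe)
        calls += 1
        xs.append(probe)
        fs.append(f)
    for _ in range(max_steps):
        if abs(fs[-1]) < fmax:          # converged at the last queried point
            break
        c1, c0 = np.polyfit(xs[-2:], fs[-2:], 1)    # refit surrogate force
        if c1 < 0:                      # restoring fit: jump to its zero-force point
            step = -c0 / c1 - xs[-1]
        else:                           # non-physical fit: fall back to descent
            step = 0.1 * fs[-1]
        step = float(np.clip(step, -trust, trust))  # conservative trust radius
        x_new = xs[-1] + step
        _, f_new = dft_single_point(x_new)          # one validating "DFT" call
        calls += 1
        xs.append(x_new)
        fs.append(f_new)
    return xs[-1], calls
```

On this toy well, the surrogate loop reaches the minimum in a handful of single-point calls where plain descent needs tens; the paper quantifies the analogous savings (80% and 42% fewer DFT calls) on real materials systems.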

Keywords