AIP Advances (Sep 2024)

Masked pretraining strategy for neural potentials

  • Zehua Zhang
  • Zijie Li
  • Amir Barati Farimani

DOI
https://doi.org/10.1063/5.0202647
Journal volume & issue
Vol. 14, no. 9
pp. 095031-1 – 095031-10

Abstract


We propose a masked pretraining method for Graph Neural Networks (GNNs) to improve their performance in fitting potential energy surfaces, particularly for water and small organic molecule systems. GNNs are pretrained by recovering the spatial information of atoms that are masked out at a given ratio, and are then transferred and fine-tuned on atomic force fields. Through such pretraining, GNNs learn meaningful priors about the structure and underlying physics of molecular systems that are useful for downstream tasks. With comprehensive experiments and ablation studies, we show that the proposed method improves both the accuracy and the convergence speed of GNNs compared to counterparts trained from scratch or with other pretraining techniques. This approach showcases its potential to enhance the performance and data efficiency of GNNs in fitting molecular force fields.
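
For concreteness, below is a minimal, hypothetical sketch of the masked-pretraining step the abstract describes. It uses plain PyTorch, substitutes a small per-atom MLP for the GNN, and every name and hyperparameter in it (TinyAtomEncoder, masked_pretrain_step, the 15% mask ratio, zeroing as the corruption) is an illustrative assumption rather than the paper's actual implementation.

```python
# Illustrative sketch of masked pretraining for a neural potential.
# NOT the authors' code: the GNN is stood in for by a small per-atom
# MLP, and all names/hyperparameters here are assumptions chosen for
# self-containedness.
import torch
import torch.nn as nn


class TinyAtomEncoder(nn.Module):
    """Stand-in for a GNN encoder over (atom type, position) features."""

    def __init__(self, num_atom_types: int = 10, hidden: int = 64):
        super().__init__()
        self.type_emb = nn.Embedding(num_atom_types, hidden)
        self.pos_proj = nn.Linear(3, hidden)
        self.mlp = nn.Sequential(
            nn.Linear(hidden, hidden), nn.SiLU(), nn.Linear(hidden, hidden)
        )
        self.pos_head = nn.Linear(hidden, 3)  # predicts 3D coordinates

    def forward(self, atom_types, positions):
        h = self.type_emb(atom_types) + self.pos_proj(positions)
        return self.pos_head(self.mlp(h))


def masked_pretrain_step(model, atom_types, positions, mask_ratio=0.15):
    """Hide the coordinates of a random subset of atoms and train the
    model to recover them; the loss is MSE on the masked atoms only."""
    n = positions.shape[0]
    mask = torch.rand(n) < mask_ratio
    if not mask.any():          # guard against an empty mask
        mask[torch.randint(n, (1,))] = True
    corrupted = positions.clone()
    corrupted[mask] = 0.0       # simple corruption: zero out masked atoms
    pred = model(atom_types, corrupted)
    return ((pred[mask] - positions[mask]) ** 2).mean()


# Usage on a random 20-atom "molecule":
model = TinyAtomEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
types = torch.randint(0, 10, (20,))
pos = torch.randn(20, 3)
loss = masked_pretrain_step(model, types, pos)
loss.backward()
opt.step()
# Per the abstract, the pretrained encoder weights would then be
# transferred and fine-tuned on atomic force prediction.
```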