npj Computational Materials (Jul 2024)

Learning together: Towards foundation models for machine learning interatomic potentials with meta-learning

  • Alice E. A. Allen,
  • Nicholas Lubbers,
  • Sakib Matin,
  • Justin Smith,
  • Richard Messerly,
  • Sergei Tretiak,
  • Kipton Barros

DOI
https://doi.org/10.1038/s41524-024-01339-x
Journal volume & issue
Vol. 10, no. 1
pp. 1–9

Abstract

The development of machine learning models has led to an abundance of datasets containing quantum mechanical (QM) calculations for molecular and material systems. However, traditional training methods for machine learning models cannot leverage this wealth of data because they require every dataset to be generated with the same QM method. Taking machine learning interatomic potentials (MLIPs) as an example, we show that meta-learning techniques, a recent advancement from the machine learning community, can be used to fit multiple levels of QM theory in the same training process. Meta-learning changes the training procedure to learn a representation that can be easily re-trained to new tasks with small amounts of data. We then demonstrate that meta-learning enables simultaneous training on multiple large organic-molecule datasets. As a proof of concept, we examine the performance of an MLIP refit to a small drug-like molecule and show that pre-training potentials on multiple levels of theory with meta-learning improves performance. This improvement appears both as reduced error and as a smoother potential energy surface. We therefore show that meta-learning can exploit existing datasets with inconsistent QM levels of theory to produce models that specialize better to new datasets. This opens new routes for creating pre-trained foundation models for interatomic potentials.
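
Illustrative sketch (not from the paper): the meta-learning procedure described in the abstract can be made concrete with a Reptile-style update, one common meta-learning algorithm. An inner loop adapts a copy of the shared model to a single task (here, one dataset at one QM level of theory), and an outer loop moves the shared weights toward the adapted weights. The toy network, the names reptile_step and qm_task_loaders, the random stand-in data, and all hyperparameters below are assumptions for illustration; the authors' actual architecture and meta-learning variant may differ.

    import copy
    import itertools

    import torch
    import torch.nn as nn

    def reptile_step(model, task_loader, inner_lr=1e-3, inner_steps=5, meta_lr=0.1):
        """One Reptile meta-update on a single task (one QM-level dataset).

        Hypothetical sketch: task_loader yields (descriptor, energy) batches.
        """
        adapted = copy.deepcopy(model)            # task-specific copy of shared model
        opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
        loss_fn = nn.MSELoss()
        batches = itertools.cycle(task_loader)
        for _ in range(inner_steps):              # inner loop: adapt to this task
            x, y = next(batches)
            opt.zero_grad()
            loss_fn(adapted(x), y).backward()
            opt.step()
        with torch.no_grad():                     # outer loop: meta-update
            for p, q in zip(model.parameters(), adapted.parameters()):
                p.add_(meta_lr * (q - p))         # nudge shared weights toward adapted ones

    # Synthetic stand-in data: in practice, one DataLoader per QM level of theory.
    datasets = [torch.utils.data.TensorDataset(torch.randn(256, 64), torch.randn(256, 1))
                for _ in range(3)]
    qm_task_loaders = [torch.utils.data.DataLoader(d, batch_size=32, shuffle=True)
                       for d in datasets]

    # Meta-training: interleave Reptile updates across the levels of theory.
    model = nn.Sequential(nn.Linear(64, 128), nn.SiLU(), nn.Linear(128, 1))
    for epoch in range(100):
        for loader in qm_task_loaders:
            reptile_step(model, loader)

After meta-training, fine-tuning to a new, small dataset amounts to running the inner loop alone on that dataset, which corresponds to the refitting step the abstract describes.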