npj Computational Materials (Nov 2021)

Benchmarking the performance of Bayesian optimization across multiple experimental materials science domains

  • Qiaohao Liang,
  • Aldair E. Gongora,
  • Zekun Ren,
  • Armi Tiihonen,
  • Zhe Liu,
  • Shijing Sun,
  • James R. Deneault,
  • Daniil Bash,
  • Flore Mekki-Berrada,
  • Saif A. Khan,
  • Kedar Hippalgaonkar,
  • Benji Maruyama,
  • Keith A. Brown,
  • John Fisher III,
  • Tonio Buonassisi

DOI
https://doi.org/10.1038/s41524-021-00656-9
Journal volume & issue
Vol. 7, no. 1
pp. 1 – 10

Abstract

Bayesian optimization (BO) has been leveraged for guiding autonomous and high-throughput experiments in materials science. However, few studies have evaluated the efficiency of BO across a broad range of experimental materials domains. In this work, we quantify the performance of BO with a collection of surrogate model and acquisition function pairs across five diverse experimental materials systems. Using acceleration and enhancement metrics defined for materials optimization objectives, we find that surrogate models such as Gaussian Process (GP) with anisotropic kernels and Random Forest (RF) perform comparably in BO, and both outperform the commonly used GP with isotropic kernels. GP with anisotropic kernels demonstrates the greatest robustness, yet RF is a close alternative and warrants more consideration because it is free from distribution assumptions, has lower time complexity, and requires less effort in initial hyperparameter selection. We also raise awareness about the benefits of using GP with anisotropic kernels in future materials optimization campaigns.
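To make the abstract's comparison concrete, the sketch below runs a single BO acquisition step with the three surrogate choices it names: a GP with an isotropic kernel (one shared length scale), a GP with an anisotropic kernel (one length scale per input dimension), and an RF whose tree-to-tree spread serves as an uncertainty proxy. This is not the paper's code; the toy objective, scikit-learn API usage, and expected-improvement acquisition are illustrative assumptions.

```python
# Hedged sketch (not the paper's implementation): one BO step comparing
# isotropic-GP, anisotropic-GP, and RF surrogates under an EI acquisition.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def objective(X):
    # Toy 2-D objective with different sensitivity per dimension
    # (anisotropy), standing in for an experimental response surface.
    return np.sin(3 * X[:, 0]) + 0.1 * X[:, 1] ** 2

X = rng.uniform(-1, 1, size=(20, 2))       # "measured" experiments
y = objective(X)
X_cand = rng.uniform(-1, 1, size=(500, 2))  # candidate experiments

def expected_improvement(mu, sigma, y_best):
    # Standard expected-improvement acquisition for maximization.
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

# Isotropic GP: a single scalar length scale shared by all inputs.
gp_iso = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0), normalize_y=True).fit(X, y)
# Anisotropic GP (ARD): one length scale per input dimension.
gp_aniso = GaussianProcessRegressor(
    kernel=RBF(length_scale=[1.0, 1.0]), normalize_y=True).fit(X, y)
# RF surrogate: distribution-free; uncertainty from ensemble spread.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

y_best = y.max()
for name, (mu, sigma) in {
    "GP-isotropic": gp_iso.predict(X_cand, return_std=True),
    "GP-anisotropic": gp_aniso.predict(X_cand, return_std=True),
}.items():
    x_next = X_cand[np.argmax(expected_improvement(mu, sigma, y_best))]
    print(name, "proposes", x_next)

tree_preds = np.stack([t.predict(X_cand) for t in rf.estimators_])
mu_rf, sigma_rf = tree_preds.mean(axis=0), tree_preds.std(axis=0)
x_next_rf = X_cand[np.argmax(expected_improvement(mu_rf, sigma_rf, y_best))]
print("RF proposes", x_next_rf)
```

In a full campaign the proposed point would be evaluated, appended to the training set, and the loop repeated; the anisotropic GP's fitted per-dimension length scales are what let it down-weight the less influential input, which is one plausible reading of the robustness the abstract reports.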