Artificial Intelligence Chemistry (Jun 2024)

A general strategy for improving the performance of PINNs -- Analytical gradients and advanced optimizers in the NeuralSchrödinger framework

  • Jakob Gamper,
  • Hans Georg Gallmetzer,
  • Alexander K.H. Weiss,
  • Thomas S. Hofer

Journal volume & issue
Vol. 2, no. 1
p. 100047

Abstract

In this work, the previously introduced NeuralSchrödinger PINN is extended towards the use of analytical gradient expressions of the loss function. It is shown that the analytical gradients derived in this work improve the convergence behaviour of both the BFGS and ADAM optimizers compared to the previously employed numerical gradient implementation. In addition, the use of parallelised GPU computations via CUDA greatly increased the computational performance over the previous single-core CPU implementation. As a consequence, an extension of the NeuralSchrödinger PINN towards two-dimensional quantum systems became feasible, as also demonstrated in this work.
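The central claim of the abstract — that analytical gradients of the loss improve optimizer convergence over finite-difference (numerical) gradients — can be illustrated on a small scale. The sketch below is not the authors' NeuralSchrödinger implementation; it uses a hypothetical test loss (the Rosenbrock function as a stand-in for a PINN loss landscape) and a minimal hand-written ADAM optimizer to contrast the two gradient modes. Note that a central-difference gradient costs two extra loss evaluations per parameter per step, which is the overhead the analytical expressions remove.

```python
import numpy as np

def loss(theta):
    # Rosenbrock function: an illustrative stand-in for a PINN loss landscape,
    # not the Schrödinger-equation residual loss used in the paper.
    x, y = theta
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

def grad_analytical(theta):
    # Closed-form gradient of the test loss.
    x, y = theta
    return np.array([
        -2.0 * (1.0 - x) - 400.0 * x * (y - x ** 2),
        200.0 * (y - x ** 2),
    ])

def grad_numerical(theta, h=1e-5):
    # Central finite differences: 2 * dim extra loss evaluations per step.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = h
        g[i] = (loss(theta + e) - loss(theta - e)) / (2.0 * h)
    return g

def adam(grad_fn, theta0, lr=2e-3, steps=20000, b1=0.9, b2=0.999, eps=1e-8):
    # Minimal ADAM optimizer (Kingma & Ba update rule) in plain numpy.
    theta = theta0.astype(float).copy()
    m = np.zeros_like(theta)
    v = np.zeros_like(theta)
    for t in range(1, steps + 1):
        g = grad_fn(theta)
        m = b1 * m + (1.0 - b1) * g
        v = b2 * v + (1.0 - b2) * g ** 2
        mhat = m / (1.0 - b1 ** t)
        vhat = v / (1.0 - b2 ** t)
        theta -= lr * mhat / (np.sqrt(vhat) + eps)
    return theta

theta0 = np.array([-1.2, 1.0])
theta_star = adam(grad_analytical, theta0)
print("initial loss:", loss(theta0))
print("final loss:  ", loss(theta_star))
```

Swapping `grad_analytical` for `grad_numerical` in the `adam` call gives the same update rule but with the finite-difference overhead and truncation error; in the paper this trade-off is studied for BFGS and ADAM on the actual Schrödinger-equation loss, with CUDA parallelisation supplying the additional speedup.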

Keywords