Chaos, Solitons & Fractals: X (Mar 2020)
The scaling of physics-informed machine learning with data and dimensions
Abstract
We quantify how incorporating physics into neural network design can significantly improve the learning and forecasting of dynamical systems, even nonlinear systems of many dimensions. We train conventional and Hamiltonian neural networks on increasingly difficult dynamical systems and compute their forecasting errors as the amount of training data and the number of system dimensions vary. A map-building perspective elucidates the superiority of Hamiltonian neural networks. The results clarify the critical relation among data, dimension, and neural network learning performance.
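The key idea behind a Hamiltonian neural network is that the network outputs a single scalar H(q, p), and the forecast vector field is obtained from Hamilton's equations, dq/dt = ∂H/∂p and dp/dt = -∂H/∂q, rather than being predicted directly. The following minimal sketch illustrates this forecasting step; a known toy pendulum Hamiltonian stands in for a trained network, gradients are taken by finite differences, and the integrator is a plain RK4 step. None of these specific choices are claimed to match the paper's architecture or training procedure.

```python
import numpy as np

# Toy scalar Hamiltonian standing in for a trained network's output H(q, p).
# Here: a 1-D nonlinear pendulum, H = p^2/2 + (1 - cos q). (Illustrative choice,
# not the systems studied in the paper.)
def H(q, p):
    return 0.5 * p**2 + (1.0 - np.cos(q))

def hamiltonian_vector_field(q, p, eps=1e-5):
    """Hamilton's equations from the scalar H via central finite differences:
    dq/dt = dH/dp,  dp/dt = -dH/dq."""
    dHdq = (H(q + eps, p) - H(q - eps, p)) / (2.0 * eps)
    dHdp = (H(q, p + eps) - H(q, p - eps)) / (2.0 * eps)
    return dHdp, -dHdq

def rk4_step(q, p, dt):
    # Classical 4th-order Runge-Kutta step on the Hamiltonian vector field.
    k1 = hamiltonian_vector_field(q, p)
    k2 = hamiltonian_vector_field(q + 0.5 * dt * k1[0], p + 0.5 * dt * k1[1])
    k3 = hamiltonian_vector_field(q + 0.5 * dt * k2[0], p + 0.5 * dt * k2[1])
    k4 = hamiltonian_vector_field(q + dt * k3[0], p + dt * k3[1])
    q_new = q + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0
    p_new = p + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0
    return q_new, p_new

# Forecast a trajectory and check that the learned-energy surrogate is
# (approximately) conserved along it, as the Hamiltonian structure guarantees.
q, p = 1.0, 0.0
E0 = H(q, p)
for _ in range(1000):
    q, p = rk4_step(q, p, dt=0.01)
drift = abs(H(q, p) - E0)
print(drift < 1e-4)
```

Because the vector field is derived from a single scalar function, the forecast trajectory stays near a level set of H; a conventional network that predicts (dq/dt, dp/dt) directly has no such constraint, which is one lens on the forecasting-error gap the abstract describes.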