The Astrophysical Journal (Jan 2024)
Modeling the Time Evolution of Compact Binary Systems with Machine Learning
Abstract
This work introduces advanced computational techniques for modeling the time evolution of compact binary systems using machine learning. The dynamics of compact binaries, such as black holes and neutron stars, pose significant nonlinear challenges owing to strong gravitational interactions and the need for precise numerical simulation. Traditional methods, such as the post-Newtonian approximation, often require substantial computational resources and face challenges in accuracy and efficiency. Here, we employ machine learning algorithms, including deep learning models such as long short-term memory (LSTM) networks and temporal convolutional networks (TCNs), to predict the future evolution of these systems from extensive simulation data. Our results demonstrate that LSTM and TCN models, used purely as black-box sequence predictors, can significantly improve prediction accuracy without physics-informed neural networks (i.e., partial-differential-equation solvers endowed with prior knowledge or inductive bias). Using the LSTM and TCN, we obtain R^2 values of 99.74% and 99.19%, respectively, for the evolutionary orbits in the compact-binary data set. Our models effectively capture the dynamics of the binaries, achieving high prediction performance while reducing computational overhead by a factor of 40 compared to conventional numerical methods. This study paves the way for more effective and computationally scalable approaches to understanding gravitational phenomena and to predictive modeling in gravitational-wave astronomy.
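As a concrete illustration of the black-box sequence-prediction setup described in the abstract, the sketch below trains a small LSTM to predict the next point of an orbit from a window of past points and reports an R^2 score. The architecture, hyperparameters, and the synthetic decaying-orbit data are assumptions made purely for illustration; they are not the authors' implementation, whose models are trained on extensive simulation data.

```python
# Minimal sketch (not the paper's code) of an LSTM used as a black-box
# sequence predictor for orbital trajectories. Data and hyperparameters
# are illustrative assumptions; the real training data would come from
# post-Newtonian or numerical-relativity simulations.
import numpy as np
import torch
import torch.nn as nn

# Synthetic stand-in data: (x, y) samples of a slowly decaying circular orbit.
t = np.linspace(0.0, 40.0 * np.pi, 4000, dtype=np.float32)
r = 1.0 / (1.0 + 0.002 * t)
data = np.stack([r * np.cos(t), r * np.sin(t)], axis=1)   # shape (N, 2)

# Sliding windows: a sequence of `window` past samples predicts the next one.
window = 50
X = np.stack([data[i:i + window] for i in range(len(data) - window)])
y = data[window:]
X_t = torch.from_numpy(X)   # (num_samples, window, 2)
y_t = torch.from_numpy(y)   # (num_samples, 2)

class OrbitLSTM(nn.Module):
    def __init__(self, n_features=2, hidden=64, layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=layers,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):
        out, _ = self.lstm(x)            # out: (batch, window, hidden)
        return self.head(out[:, -1])     # predict the next (x, y) sample

model = OrbitLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                   # a few epochs, for illustration only
    perm = torch.randperm(len(X_t))
    for i in range(0, len(perm), 128):
        idx = perm[i:i + 128]
        opt.zero_grad()
        loss = loss_fn(model(X_t[idx]), y_t[idx])
        loss.backward()
        opt.step()
    print(f"epoch {epoch}  mse {loss.item():.3e}")

# Coefficient of determination (R^2), the metric quoted in the abstract.
with torch.no_grad():
    pred = model(X_t)
    ss_res = ((pred - y_t) ** 2).sum()
    ss_tot = ((y_t - y_t.mean(dim=0)) ** 2).sum()
    print("R^2:", (1.0 - ss_res / ss_tot).item())
```

The TCN variant would replace the recurrent layer with a stack of dilated causal 1-D convolutions over the same windows; the windowing, loss, and R^2 evaluation are unchanged.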
Keywords