PLoS ONE (Aug 2021)

A data assimilation framework that uses the Kullback-Leibler divergence.

  • Sam Pimentel,
  • Youssef Qranfal

DOI
https://doi.org/10.1371/journal.pone.0256584
Journal volume & issue
Vol. 16, no. 8
p. e0256584

Abstract

The process of integrating observations into a numerical model of an evolving dynamical system, known as data assimilation, has become an essential tool in computational science. These methods, however, are computationally expensive, as they typically involve large matrix multiplications and inversions. Furthermore, it is challenging to incorporate constraints into the procedure, such as requiring a positive state vector. Here we introduce an entirely new approach to data assimilation, one built on an information measure: the unnormalized Kullback-Leibler divergence is used in place of the standard choice of Euclidean distance. Two sequential data assimilation algorithms are presented within this framework and demonstrated numerically. These new methods are iterative and do not require an adjoint. We find them to be computationally more efficient than Optimal Interpolation (the 3D-Var solution) and the Kalman filter whilst maintaining similar accuracy. Furthermore, these Kullback-Leibler data assimilation (KL-DA) methods naturally embed constraints, unlike Kalman filter approaches. They are ideally suited to systems that require positive-valued solutions, as the KL-DA guarantees positivity without the need for transformations, projections, or any additional steps. This Kullback-Leibler framework presents an interesting new direction for the development of data assimilation theory. The techniques introduced here could be developed further and may hold potential for applications in the many disciplines that use data assimilation, especially where variables of large-scale systems must be evolved subject to physical constraints.
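
To make the central idea concrete, below is a minimal Python sketch, not the authors' published algorithm but an illustrative, EMML-style multiplicative iteration that reduces the unnormalized Kullback-Leibler divergence KL(y, Hx) = sum_j [ y_j log(y_j / (Hx)_j) - y_j + (Hx)_j ] between positive observations y and the mapped state Hx. Because each update rescales the state multiplicatively, a positive first guess remains positive at every iteration, and only the transpose of the observation operator H is needed (no adjoint model). The operator H, the dimensions, and the helper names here are hypothetical.

import numpy as np

def unnormalized_kl(p, q):
    # Unnormalized KL divergence sum(p*log(p/q) - p + q) for positive vectors.
    return np.sum(p * np.log(p / q) - p + q)

def kl_multiplicative_analysis(x_b, y, H, n_iter=100):
    # EMML-style multiplicative update: each iterate is a positive rescaling
    # of the previous one, so positivity of the first guess x_b is preserved.
    x = x_b.astype(float).copy()
    col_sums = H.sum(axis=0)             # per-component normalization
    for _ in range(n_iter):
        ratio = y / (H @ x)              # observation-space correction factors
        x = x * (H.T @ ratio) / col_sums
    return x

# Hypothetical toy problem: 3 positive observations of a 4-component state.
rng = np.random.default_rng(0)
H = rng.uniform(0.1, 1.0, size=(3, 4))   # positive observation operator
x_true = np.array([1.0, 2.0, 0.5, 3.0])
y = H @ x_true                           # synthetic, noise-free observations
x_b = np.ones(4)                         # positive background (first guess)

x_a = kl_multiplicative_analysis(x_b, y, H)
print("analysis state:", x_a)
print("KL(y, H x_a):", unnormalized_kl(y, H @ x_a))  # shrinks as iterations proceed
print("all positive:", bool(np.all(x_a > 0)))

Note that the KL-DA algorithms described in the abstract also balance a background (forecast) term against the observation term; the sketch above shows only the positivity-preserving multiplicative mechanism that such KL-based iterations share.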