Mathematics (Nov 2019)

Numerical Simulation of Feller’s Diffusion Equation

  • Denys Dutykh

DOI
https://doi.org/10.3390/math7111067
Journal volume & issue
Vol. 7, no. 11
p. 1067

Abstract

This article is devoted to Feller’s diffusion equation, which arises naturally in probability and physics (e.g., wave turbulence theory). If discretized naively, this equation may present serious numerical difficulties, since the diffusion coefficient is practically unbounded and most of its solutions are weakly divergent at the origin. To overcome these difficulties, we reformulate this equation using ideas from Lagrangian fluid mechanics. This allows us to obtain a numerical scheme with a rather generous stability condition. Finally, the algorithm admits an elegant implementation, and the corresponding Matlab code is provided with this article under an open-source license.
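To illustrate the difficulty the abstract alludes to, the sketch below (in Python rather than the paper's Matlab, and not the authors' Lagrangian scheme) applies a naive explicit finite-difference discretization to the prototypical Feller-type diffusion p_t = (x p)_{xx} on a truncated domain. The grid size, domain length, and initial datum are all illustrative assumptions. Because the diffusion coefficient D(x) = x grows without bound, the explicit stability restriction dt ≤ dx²/(2 max D) tightens as the domain is enlarged:

```python
import numpy as np

# Illustrative sketch (NOT the authors' scheme): explicit finite differences
# for the prototypical Feller-type diffusion  p_t = (x p)_{xx}  on x in (0, L].
# The diffusion coefficient D(x) = x is unbounded, so the explicit stability
# bound  dt <= dx**2 / (2 * max D)  becomes ever more severe as L grows.

L, N = 10.0, 200                       # domain length and grid size (assumed)
dx = L / N
x = np.linspace(dx, L, N)              # grid avoiding the singular origin x = 0
dt = 0.4 * dx**2 / (2.0 * x.max())     # explicit stability bound, with margin

p = np.exp(-(x - 3.0)**2)              # smooth initial datum (assumed)
p /= p.sum() * dx                      # normalise to (approximately) unit mass

for _ in range(500):
    q = x * p                          # auxiliary flux variable  q = x p
    lap = np.zeros_like(p)
    lap[1:-1] = (q[2:] - 2.0 * q[1:-1] + q[:-2]) / dx**2
    p = p + dt * lap                   # forward Euler step on interior nodes
```

With dt chosen under the bound above the update is monotone (all stencil weights non-negative), so positivity of the initial datum is preserved; violate the bound and the iteration blows up, which is why the paper's reformulation, with its far more generous stability condition, is attractive.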

Keywords