Journal of Engineering Science (Chişinău) (May 2020)
STOCHASTIC OPTIMAL CONTROL OF A TWO-DIMENSIONAL DYNAMICAL SYSTEM
Abstract
In this paper, we consider the problem of optimally controlling a two-dimensional dynamical system until it reaches either of two boundaries. The controlled dynamical system (X(t), Y(t)) is a generalization of the classic two-dimensional Kermack-McKendrick model for the spread of epidemics. Moreover, the system is subject to random jumps of fixed size occurring according to a Poisson process. The system is controlled until the sum X(t) + Y(t) is equal to either 0 or d (> 0) for the first time. Particular problems are solved explicitly.
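To illustrate the kind of system described above, the following minimal Python sketch simulates Kermack-McKendrick-type dynamics with fixed-size Poisson jumps, stopped when X(t) + Y(t) first leaves the interval (0, d). The specific drift, the way the control u enters (scaling the infection rate), the component receiving the jumps, and all parameter values are illustrative assumptions, not the model or solution developed in the paper.

```python
import numpy as np

def simulate(x0, y0, d, beta=0.5, gamma=0.2, u=1.0,
             jump_rate=0.1, jump_size=0.05, dt=1e-3,
             t_max=200.0, rng=None):
    """Euler simulation of a controlled Kermack-McKendrick-type system
    (X, Y) with fixed-size Poisson jumps, stopped when X + Y first
    reaches 0 or d (or at t_max as a safeguard).

    Assumptions (not from the paper): the control u scales the infection
    rate, and the jumps are applied to the Y component.
    """
    rng = np.random.default_rng() if rng is None else rng
    x, y, t = x0, y0, 0.0
    while 0.0 < x + y < d and t < t_max:
        # Deterministic Kermack-McKendrick-type drift with control u.
        dx = -u * beta * x * y * dt
        dy = (u * beta * x * y - gamma * y) * dt
        x += dx
        y += dy
        # Fixed-size jump with probability ~ jump_rate * dt
        # (first-order approximation of a Poisson process over dt).
        if rng.random() < jump_rate * dt:
            y += jump_size
        t += dt
    return t, x, y

if __name__ == "__main__":
    T, xT, yT = simulate(x0=0.4, y0=0.1, d=1.0)
    print(f"stopped at t = {T:.3f} with X + Y = {xT + yT:.3f}")
```

In the paper itself the control is chosen to optimize a cost criterion up to this first-passage time; the sketch only reproduces the uncontrolled-versus-controlled dynamics and the stopping rule, not the optimization.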
Keywords