Energies (Dec 2022)

Handling Computation Hardness and Time Complexity Issue of Battery Energy Storage Scheduling in Microgrids by Deep Reinforcement Learning

  • Zeyue Sun,
  • Mohsen Eskandari,
  • Chaoran Zheng,
  • Ming Li

DOI
https://doi.org/10.3390/en16010090
Journal volume & issue
Vol. 16, no. 1
p. 90

Abstract

With the development of microgrids (MGs), an energy management system (EMS) is required to ensure the stable and economically efficient operation of the MG. In this paper, an intelligent EMS is proposed by exploiting the deep reinforcement learning (DRL) technique. DRL is employed as an effective method for handling the computational hardness of optimally scheduling the charging/discharging of battery energy storage in the MG EMS. Since the optimal charge/discharge decision for the battery depends on its state of charge, which is coupled across consecutive time steps, obtaining the optimum solution demands scheduling over the full time horizon. This, however, increases the time complexity of the EMS and turns it into an NP-hard problem. By taking the energy storage system's charging/discharging power as the control variable, the DRL agent is trained to find the best energy storage control policy for both deterministic and stochastic weather scenarios. The efficiency of the proposed strategy in minimizing the cost of purchasing energy is also demonstrated quantitatively through programming verification and comparison against the results of mixed-integer programming and the heuristic genetic algorithm (GA).
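To illustrate the sequential structure the abstract describes (the battery's state of charge couples decisions across time steps, so per-step greedy choices are suboptimal), the following is a minimal sketch using tabular Q-learning on a toy battery-scheduling problem. This is not the paper's deep-RL method: the horizon, prices, net-load profile, and battery limits are all illustrative assumptions, and the charge/discharge power is discretized to three actions for simplicity.

```python
# Toy battery-scheduling MDP solved by tabular Q-learning (a sketch only;
# the paper uses deep RL on a richer state/action space).
import random

T = 6                                       # assumed scheduling horizon (time steps)
PRICE = [0.2, 0.2, 0.5, 0.5, 0.3, 0.3]      # assumed purchase price per kWh
NET_LOAD = [1.0, 1.0, 2.0, 2.0, 1.0, 1.0]   # assumed load minus PV output, kWh
SOC_LEVELS = 5                              # discretized state of charge: 0..4 kWh
ACTIONS = [-1, 0, 1]                        # discharge / idle / charge, 1 kWh per step

def step(t, soc, a):
    """Apply action a at time t; return (next_soc, reward = negative purchase cost)."""
    a = max(-soc, min(SOC_LEVELS - 1 - soc, a))   # clip action to SoC bounds
    grid = max(0.0, NET_LOAD[t] + a)              # energy bought from the grid
    return soc + a, -PRICE[t] * grid

def train(episodes=20000, alpha=0.1, gamma=1.0, eps=0.1, seed=0):
    """Learn Q(t, soc, action) by epsilon-greedy Q-learning over full episodes."""
    rng = random.Random(seed)
    Q = {(t, s): [0.0] * len(ACTIONS)
         for t in range(T + 1) for s in range(SOC_LEVELS)}  # row t = T is terminal
    for _ in range(episodes):
        soc = 0
        for t in range(T):
            if rng.random() < eps:
                i = rng.randrange(len(ACTIONS))
            else:
                i = max(range(len(ACTIONS)), key=lambda j: Q[(t, soc)][j])
            nxt, r = step(t, soc, ACTIONS[i])
            target = r + gamma * max(Q[(t + 1, nxt)])
            Q[(t, soc)][i] += alpha * (target - Q[(t, soc)][i])
            soc = nxt
    return Q

def rollout(Q):
    """Follow the greedy policy from an empty battery; return total purchase cost."""
    soc, cost = 0, 0.0
    for t in range(T):
        i = max(range(len(ACTIONS)), key=lambda j: Q[(t, soc)][j])
        soc, r = step(t, soc, ACTIONS[i])
        cost -= r
    return cost

cost = rollout(train())   # should undercut the no-battery cost of 3.0 for this instance
```

In this toy instance, buying every kWh on demand costs 3.0; a policy that charges in the cheap early hours and discharges in the expensive midday hours does better, which is exactly the temporal coupling that makes full-horizon scheduling necessary in the first place.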

Keywords