Energy Conversion and Economics (Dec 2022)

Mobile battery energy storage system control with knowledge‐assisted deep reinforcement learning

  • Huan Zhao,
  • Zifan Liu,
  • Xuan Mai,
  • Junhua Zhao,
  • Jing Qiu,
  • Guolong Liu,
  • Zhao Yang Dong,
  • Amer M. Y. M. Ghias

DOI
https://doi.org/10.1049/enc2.12075
Journal volume & issue
Vol. 3, no. 6
pp. 381–391

Abstract


Most mobile battery energy storage systems (MBESSs) are designed to enhance power system resilience and provide ancillary services to the system operator using energy storage. As the penetration of renewable energy and the fluctuation of electricity prices increase in the power system, demand‐side commercial entities can profit more by exploiting the mobility and flexibility of MBESSs than by using stationary energy storage systems. The profit is closely tied to the spatiotemporal decision model and is influenced by environmental uncertainties such as electricity prices and traffic conditions. However, solving the real‐time control problem while accounting for long‐term profit and these uncertainties is time‐consuming. To address this problem, this paper proposes a deep reinforcement learning framework for MBESSs to maximize profit through market arbitrage. A knowledge‐assisted double deep Q network (KA‐DDQN) algorithm is proposed within this framework to learn the optimal policy and improve learning efficiency. Moreover, two criterion‐based action‐generation methods for knowledge‐assisted learning are proposed for integer actions, utilizing scheduling and short‐term programming results. Simulation results show that the proposed framework and method achieve the optimal result, and that KA‐DDQN accelerates the learning process by approximately 30% compared with the original method.
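To make the two core ideas in the abstract concrete, below is a minimal sketch of (a) the standard double deep Q network (DDQN) target, in which the online network selects the next action and the target network evaluates it, and (b) a generic knowledge-assisted action rule that occasionally substitutes an expert (knowledge-based) action for exploration. This is an illustrative assumption of how such components are commonly structured, not the paper's exact KA-DDQN algorithm; all function names and parameters here are hypothetical.

```python
import numpy as np

def ddqn_target(reward, next_q_online, next_q_target, gamma=0.99, done=False):
    """Double DQN bootstrap target: the online network picks the action,
    the target network evaluates it (reduces overestimation bias)."""
    if done:
        return float(reward)
    a_star = int(np.argmax(next_q_online))          # action chosen by online net
    return float(reward + gamma * next_q_target[a_star])  # value from target net

def knowledge_assisted_action(q_values, expert_action, epsilon=0.1, rng=None):
    """Hypothetical knowledge-assisted exploration: with probability epsilon,
    follow a knowledge-based (e.g. scheduling-derived) action instead of a
    random one; otherwise act greedily on the learned Q-values."""
    rng = rng if rng is not None else np.random.default_rng()
    if rng.random() < epsilon:
        return int(expert_action)
    return int(np.argmax(q_values))
```

In a sketch like this, the expert action could come from a short-term scheduling or programming solver, which guides early exploration and is one plausible reason a knowledge-assisted learner converges faster than plain DDQN.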

Keywords