Energies (Nov 2021)

Operation of Distributed Battery Considering Demand Response Using Deep Reinforcement Learning in Grid Edge Control

  • Wenying Li,
  • Ming Tang,
  • Xinzhen Zhang,
  • Danhui Gao,
  • Jian Wang

DOI
https://doi.org/10.3390/en14227749
Journal volume & issue
Vol. 14, no. 22
p. 7749

Abstract


Battery energy storage systems (BESSs) can facilitate economical grid operation through demand response (DR) and are regarded as the most significant DR resource. Among them, distributed BESSs integrating home photovoltaic (PV) systems have developed rapidly, accounting for nearly 40% of newly installed capacity. However, the usage scenarios and efficiency of distributed BESSs are far from sufficient to exploit potential loads or to overcome the uncertainties caused by uncoordinated operation. In this paper, the low-voltage transformer-powered area (LVTPA) is first defined, and a DR grid-edge controller based on deep reinforcement learning is then implemented to maximize total DR benefits and promote three-phase balance in the LVTPA. The proposed DR problem is formulated as a Markov decision process (MDP), and the deep deterministic policy gradient (DDPG) algorithm is applied to train the controller to learn the optimal DR strategy. Additionally, a life-cycle cost model of the BESS is established and incorporated into the DR scheme to measure revenue. Numerical results, compared against deep Q-learning and model-based methods, demonstrate the effectiveness and validity of the proposed method.
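The abstract formulates battery DR as an MDP with a continuous charge/discharge action and a life-cycle cost term in the objective, solved with DDPG. As a minimal sketch of the environment side of such an MDP (not the paper's implementation), the following toy model uses a hypothetical state vector of (state of charge, net load, price), an illustrative time-of-use price curve, and a flat per-kWh degradation cost standing in for the paper's life-cycle cost model:

```python
import numpy as np

class BatteryDREnv:
    """Minimal MDP sketch of a battery demand-response problem.

    State:  (state of charge, net household load, electricity price).
    Action: continuous charge (+) / discharge (-) power in kW.
    Reward: DR saving at the current price minus a flat per-kWh
            degradation cost (placeholder for a life-cycle cost model).
    All parameter values are illustrative assumptions.
    """

    def __init__(self, capacity_kwh=10.0, max_power_kw=3.0,
                 degradation_cost=0.05, dt_h=1.0, seed=0):
        self.capacity = capacity_kwh
        self.max_power = max_power_kw
        self.degradation_cost = degradation_cost  # $/kWh of throughput
        self.dt = dt_h
        self.rng = np.random.default_rng(seed)
        self.reset()

    def _price(self):
        # Toy time-of-use price curve over a 24-step day ($/kWh)
        return 0.10 + 0.05 * np.sin(2 * np.pi * self.t / 24)

    def _state(self):
        net_load = max(0.0, 1.0 + 0.5 * self.rng.standard_normal())  # kW
        return np.array([self.soc, net_load, self._price()])

    def reset(self):
        self.soc = 0.5  # fraction of usable capacity
        self.t = 0
        return self._state()

    def step(self, action_kw):
        price = self._price()
        power = float(np.clip(action_kw, -self.max_power, self.max_power))
        energy = power * self.dt  # kWh into (+) or out of (-) the battery
        # Keep the state of charge within [0, 1]
        energy = float(np.clip(energy, -self.soc * self.capacity,
                               (1.0 - self.soc) * self.capacity))
        self.soc += energy / self.capacity
        # Discharging earns at the current price; charging pays it.
        # Degradation is charged on throughput in both directions.
        reward = -energy * price - abs(energy) * self.degradation_cost
        self.t += 1
        done = self.t >= 24  # one-day episode
        return self._state(), reward, done
```

An actor-critic learner such as DDPG would then map this three-dimensional state to a continuous power setpoint and be trained on the resulting `(state, action, reward, next_state)` transitions; the clipping in `step` is what makes a bounded, continuous action space natural for a deterministic policy gradient method.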

Keywords