Energy Reports (Apr 2023)

Emergency load shedding strategy for high renewable energy penetrated power systems based on deep reinforcement learning

  • Hongwei Chen,
  • Junzhi Zhuang,
  • Gang Zhou,
  • Yuwei Wang,
  • Zhenglong Sun,
  • Yoash Levron

Journal volume & issue
Vol. 9
pp. 434–443

Abstract

Traditional event-driven emergency load shedding determines its quantitative strategy by simulating a specific set of anticipated faults, which requires high model accuracy and a matching operation mode. However, due to the model complexity of renewable power generators and their fluctuating output, the traditional event-driven load shedding strategy faces the risk of mismatch in power systems with high renewable energy penetration. To address these challenges, this paper proposes an emergency load shedding method based on data-driven strategies and deep reinforcement learning (RL). First, the reasons for possible mismatch of the event-driven load shedding strategy in renewable power systems are analyzed, and a typical mismatch scenario is constructed. Then, the emergency load shedding problem is formulated as a Markov Decision Process (MDP), and the action space, state space, and reward function of the decision process are designed. On this basis, an emergency control strategy platform based on the Gym framework is established for applying deep reinforcement learning to power system emergency control. To enhance the adaptability and efficiency of the RL agent across multi-fault scenarios, Proximal Policy Optimization (PPO) is adopted to optimize the constructed MDP. Finally, the proposed reinforcement learning-based emergency load shedding strategy is trained and verified on a modified IEEE 39-bus system. The results show that the proposed strategy effectively produces correct load shedding decisions to restore system frequency in the event-driven load shedding mismatch scenario, and adapts well to different faults and operating scenarios.
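The MDP formulation described above (state, action, and reward design wrapped in a Gym-style environment) can be sketched in miniature. The toy environment below is an illustrative assumption, not the paper's model: the simplified frequency dynamics, constants, and reward shape are all placeholders, and the fixed-fraction shedding policy stands in for the PPO-trained agent.

```python
class LoadSheddingEnv:
    """Toy frequency-response environment mirroring the Gym reset/step
    API. Dynamics, constants, and reward shape are illustrative
    assumptions, not the paper's system model."""

    def __init__(self, h=4.0, d=1.0, dt=0.1, horizon=50):
        self.h = h              # inertia constant (illustrative)
        self.d = d              # load damping coefficient (illustrative)
        self.dt = dt            # simulation step length (s)
        self.horizon = horizon  # episode length in steps
        self.reset()

    def reset(self):
        self.t = 0
        self.freq_dev = 0.0        # frequency deviation from nominal
        self.power_deficit = 0.2   # p.u. generation loss (assumed fault)
        self.shed_total = 0.0      # cumulative load shed
        return self._state()

    def _state(self):
        # State: frequency deviation, remaining deficit, load already shed
        return (self.freq_dev, self.power_deficit, self.shed_total)

    def step(self, action):
        # Action: fraction of the remaining deficit to shed this step
        shed = max(0.0, min(1.0, action)) * self.power_deficit
        self.power_deficit -= shed
        self.shed_total += shed
        # Simplified swing-equation dynamics: the remaining deficit
        # drives frequency down, load damping pulls it back.
        dfdt = (-self.power_deficit - self.d * self.freq_dev) / (2 * self.h)
        self.freq_dev += dfdt * self.dt
        self.t += 1
        # Reward: penalize frequency deviation and the amount shed,
        # so the agent restores frequency with minimal load loss.
        reward = -abs(self.freq_dev) - 0.1 * shed
        done = self.t >= self.horizon or self.freq_dev < -0.5
        return self._state(), reward, done


env = LoadSheddingEnv()
state = env.reset()
done = False
total_reward = 0.0
while not done:
    # Placeholder policy: shed a fixed fraction each step; the paper
    # instead learns this state-to-action mapping with PPO.
    state, reward, done = env.step(0.3)
    total_reward += reward
```

In the paper's setting, an environment with this interface is registered with the Gym framework and the PPO agent is trained against it, so that after many simulated fault episodes the learned policy maps system states directly to shedding actions without matching against a precomputed fault table.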

Keywords