Energies (Nov 2022)

Critical Reliability Improvement Using Q-Learning-Based Energy Management System for Microgrids

  • Lizon Maharjan,
  • Mark Ditsworth,
  • Babak Fahimi

DOI: https://doi.org/10.3390/en15238779
Journal volume & issue: Vol. 15, no. 23, p. 8779

Abstract

This paper presents a power distribution system that prioritizes the reliability of power supplied to critical loads within a community. The proposed system uses reinforcement learning (Q-learning) to train multi-port power electronic interface (MPEI) systems within a community of microgrids. The primary contributions of this article are to present a system in which Q-learning is integrated with the MPEI to reduce the impact of power contingencies on critical loads, and to evaluate the effectiveness of the resulting system. The feasibility of the proposed method is demonstrated through simulation and experiments: for a case study in which 20% of the total load is classified as critical, the system average interruption duration index (SAIDI) improves by 75% compared to traditional microgrids with no load schedule.
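To make the abstract's terms concrete, the sketch below shows (a) the standard IEEE 1366 definition of SAIDI, and (b) a minimal tabular Q-learning loop for deciding how many non-critical loads to keep online during a generation shortfall. The state, action, and reward definitions here are illustrative assumptions, not the paper's actual MPEI controller or reward design; only the SAIDI formula and the Q-learning update rule are standard.

```python
import numpy as np


def saidi(interruptions, total_customers):
    """SAIDI (IEEE 1366): total customer-minutes interrupted divided by the
    total number of customers served.
    `interruptions` is a list of (duration_minutes, customers_affected)."""
    return sum(d * n for d, n in interruptions) / total_customers


# --- Minimal tabular Q-learning sketch (illustrative assumptions) ----------
# State  s: surplus generation beyond critical demand, discretized to 0..4.
# Action a: number of non-critical loads kept online, 0..3.
# Reward  : +1 per non-critical load served; heavy penalty if the commitment
#           exceeds the surplus and would interrupt critical loads.
N_STATES, N_ACTIONS = 5, 4
alpha, gamma, epsilon = 0.1, 0.95, 0.1
Q = np.zeros((N_STATES, N_ACTIONS))
rng = np.random.default_rng(0)

s = int(rng.integers(N_STATES))
for _ in range(20_000):
    # Epsilon-greedy action selection.
    a = int(rng.integers(N_ACTIONS)) if rng.random() < epsilon else int(np.argmax(Q[s]))
    # Toy reward: serving loads within the surplus is good; overcommitting
    # (a > s) spills the deficit onto critical loads and is penalized.
    r = a if a <= s else -10 * (a - s)
    # Next surplus level drawn at random (stand-in for varying generation).
    s_next = int(rng.integers(N_STATES))
    # Standard Q-learning update: Q(s,a) += alpha * (r + gamma*max Q(s',.) - Q(s,a)).
    Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
    s = s_next

print("Greedy policy (non-critical loads kept per surplus level):", np.argmax(Q, axis=1))
print("Example SAIDI (min/customer):", saidi([(120, 50), (30, 200)], total_customers=1000))
```

With this toy reward, the learned greedy policy keeps at most as many non-critical loads online as the surplus allows, which is the intuition behind using Q-learning to protect critical loads during contingencies; the paper's actual state space, reward shaping, and MPEI integration are described in the full text.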

Keywords