Energies (May 2024)

Online EVs Vehicle-to-Grid Scheduling Coordinated with Multi-Energy Microgrids: A Deep Reinforcement Learning-Based Approach

  • Weiqi Pan,
  • Xiaorong Yu,
  • Zishan Guo,
  • Tao Qian,
  • Yang Li

DOI: https://doi.org/10.3390/en17112491
Journal volume & issue: Vol. 17, No. 11, p. 2491

Abstract

The integration of electric vehicles (EVs) into vehicle-to-grid (V2G) scheduling offers a promising opportunity to enhance the profitability of multi-energy microgrid operators (MMOs). MMOs aim to maximize their total profits by coordinating V2G scheduling and the multi-energy flexible loads of end-users while adhering to operational constraints. However, scheduling V2G strategies online is challenging due to uncertainties such as electricity prices and EV arrival/departure patterns. To address this, we propose an online V2G scheduling framework based on deep reinforcement learning (DRL) to optimize EV battery utilization in microgrids with multiple energy sources. First, we formulate an online scheduling model that integrates the management of V2G and multi-energy flexible demands, modeled as a Markov Decision Process (MDP) with unknown transition probabilities. Second, the Soft Actor-Critic (SAC) algorithm, a DRL method, is used to efficiently train the neural networks and to dynamically schedule EV charging and discharging in response to real-time grid conditions and energy demand patterns. Extensive case-study simulations are conducted to verify the effectiveness of the proposed approach. The overall results validate the efficacy of the DRL-based online V2G scheduling framework, highlighting its potential to improve profitability and sustainability in multi-energy microgrid operations.
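
To make the MDP-plus-SAC idea concrete, the following is a minimal sketch, not the authors' implementation: a toy single-battery V2G environment trained with an off-the-shelf SAC agent (gymnasium + stable-baselines3). The environment name (ToyV2GEnv), the battery capacity and power limits, the uniform price process, and the reward weights are all illustrative assumptions; the paper's actual model covers multiple EVs and multi-energy flexible demands.

```python
# Minimal sketch (not the authors' code): a toy V2G scheduling MDP and SAC training
# loop. All dynamics (price process, battery sizes, reward weights) are
# illustrative placeholders, not taken from the paper.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import SAC


class ToyV2GEnv(gym.Env):
    """One aggregated EV battery traded against an uncertain electricity price."""

    def __init__(self, horizon: int = 24):
        super().__init__()
        self.horizon = horizon            # hourly steps in one scheduling day
        self.capacity = 50.0              # assumed aggregate battery capacity (kWh)
        self.max_power = 10.0             # assumed charge/discharge limit (kW)
        # Observation: [hour / horizon, state-of-charge fraction, current price]
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(3,), dtype=np.float32)
        # Action: signed power setpoint scaled to [-1, 1] (negative = discharge to grid)
        self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(1,), dtype=np.float32)

    def _obs(self):
        return np.array([self.t / self.horizon, self.soc, self.price], dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        self.soc = 0.5                                        # EV arrives half charged
        self.price = float(self.np_random.uniform(0.2, 0.8))  # placeholder price model
        return self._obs(), {}

    def step(self, action):
        power = float(action[0]) * self.max_power             # kW, >0 charging, <0 discharging
        delta = power / self.capacity                         # change in state of charge over 1 h
        delta = float(np.clip(delta, -self.soc, 1.0 - self.soc))  # respect battery limits
        self.soc += delta
        # Reward: revenue from discharging minus cost of charging at the current price
        reward = -delta * self.capacity * self.price
        self.t += 1
        terminated = self.t >= self.horizon
        # Penalize leaving the EV under-charged at departure (end of horizon)
        if terminated and self.soc < 0.8:
            reward -= 10.0 * (0.8 - self.soc)
        self.price = float(self.np_random.uniform(0.2, 0.8))  # next price drawn from unknown transition
        return self._obs(), reward, terminated, False, {}


if __name__ == "__main__":
    env = ToyV2GEnv()
    model = SAC("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=20_000)          # training phase over simulated days
    obs, _ = env.reset()
    done = False
    while not done:                              # online scheduling: act on the observed state only
        action, _ = model.predict(obs, deterministic=True)
        obs, reward, done, _, _ = env.step(action)
```

The key point the sketch illustrates is that the agent never sees the price transition model: it observes the current state, chooses a continuous charge/discharge setpoint, and learns the scheduling policy purely from reward feedback, which is what enables online V2G decisions under uncertainty.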

Keywords