IEEE Access (Jan 2021)
Transmission Network Dynamic Planning Based on a Double Deep-Q Network With Deep ResNet
Abstract
Based on a Double Deep-Q Network with deep ResNet (DDQN-ResNet), this paper proposes a novel method for transmission network expansion planning (TNEP). Since TNEP is a large-scale mixed-integer linear programming (MILP) problem, numerical and heuristic learning-based methods suffer from heavy computational burdens in calculation and training as the network scale and the number of constraints grow. Moreover, owing to their black-box nature, the solution processes of heuristic learning-based methods lack interpretability and usually require repeated training. Using DDQN-ResNet, this paper constructs a high-performance, flexible method for solving large-scale TNEP problems with complex constraints. First, we formulate a two-objective TNEP model in which one objective minimizes the comprehensive cost and the other maximizes transmission network reliability. The comprehensive cost accounts for the expansion cost, the network loss cost, and the maintenance cost. Transmission network reliability is evaluated by the expected energy not served (EENS) and the electrical betweenness. Second, the TNEP task is constructed from this model as a Markov decision process; by abstracting the task, a TNEP environment is obtained for DDQN-ResNet. In addition, an agent is established based on DDQN-ResNet to identify the construction value of candidate lines. Finally, we perform static planning and visualize the reinforcement learning process, and dynamic planning is realized by reusing the training experience. The validity and flexibility of DDQN-ResNet are verified on the RTS 24-bus test system.
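To make the architecture named in the abstract concrete, the sketch below shows one plausible form of a double DQN whose Q-network uses residual (ResNet-style) fully connected blocks to score candidate transmission lines. This is a minimal illustration under assumed names, dimensions, and hyperparameters; it is not the authors' implementation, whose details are given in the body of the paper.

```python
# Minimal sketch of a DDQN-ResNet-style agent for line selection.
# All class names, dimensions, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Fully connected residual block: y = relu(x + F(x))."""

    def __init__(self, width: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, width),
        )

    def forward(self, x):
        return torch.relu(x + self.net(x))


class QResNet(nn.Module):
    """Maps a network-state vector to one Q-value per candidate line."""

    def __init__(self, state_dim: int, n_candidate_lines: int,
                 width: int = 128, n_blocks: int = 4):
        super().__init__()
        self.stem = nn.Sequential(nn.Linear(state_dim, width), nn.ReLU())
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
        self.head = nn.Linear(width, n_candidate_lines)

    def forward(self, state):
        return self.head(self.blocks(self.stem(state)))


def double_dqn_target(online: QResNet, target: QResNet,
                      reward, next_state, done, gamma: float = 0.99):
    """Double-DQN target: the online network selects the next action,
    the target network evaluates it, reducing Q-value overestimation."""
    with torch.no_grad():
        next_action = online(next_state).argmax(dim=1, keepdim=True)
        next_q = target(next_state).gather(1, next_action).squeeze(1)
        return reward + gamma * (1.0 - done) * next_q
```

In this reading, the residual blocks deepen the Q-network without degrading trainability, while the double-DQN target decouples action selection from action evaluation; the same trained experience could then be reused across planning horizons, consistent with the dynamic planning described in the abstract.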
Keywords