Complex & Intelligent Systems (Jun 2024)

Improving two-layer encoding of evolutionary algorithms for sparse large-scale multiobjective optimization problems

  • Jing Jiang,
  • Huoyuan Wang,
  • Juanjuan Hong,
  • Zhe Liu,
  • Fei Han

DOI
https://doi.org/10.1007/s40747-024-01489-x
Journal volume & issue
Vol. 10, no. 5
pp. 6319 – 6337

Abstract

Sparse large-scale multiobjective optimization problems (LSMOPs) are NP-hard problems whose Pareto optimal solutions contain a significant proportion of zero-valued decision variables. To solve sparse LSMOPs, recent studies typically employ a specialized two-layer encoding, in which the low-level layer handles the optimization of the zero variables and the high-level layer is responsible for the non-zero variables. However, such an encoding usually gives priority to the low-level layer and therefore cannot balance the optimization of zero and non-zero variables. To address this issue, this paper proposes to build a two-way association between the two layers through a mutual preference calculation method and a two-way matching strategy. Essentially, the two-way association balances the influence of the two layers on the encoded individual by relaxing the control of the low-level layer and strengthening that of the high-level layer, thereby balancing the optimization of zero and non-zero variables. Moreover, we propose a new evolutionary algorithm equipped with the proposed modules and compare it with several state-of-the-art algorithms on 32 benchmark problems. Extensive experiments verify its effectiveness: the proposed modules improve the two-layer encoding and help the algorithm achieve superior performance on sparse LSMOPs.
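For readers unfamiliar with this kind of representation, the sketch below illustrates a generic two-layer encoding and a toy two-way pairing between layers, assuming the common mask-plus-values scheme: each individual carries a binary low-level mask and a real-valued high-level vector, and the decoded solution is their element-wise product. The `mutual_preference` score and the greedy `two_way_match` pairing are hypothetical placeholders used only to convey the idea of a two-way association between layers; they are not the preference calculation or matching strategy defined in the paper.

```python
from __future__ import annotations
import numpy as np

def decode(mask: np.ndarray, dec: np.ndarray) -> np.ndarray:
    """Decode a two-layer individual: the binary mask selects which
    variables are non-zero, the real-valued layer supplies their values."""
    return mask * dec

def mutual_preference(mask: np.ndarray, dec: np.ndarray) -> float:
    """Hypothetical preference score: reward value layers whose
    large-magnitude entries line up with the positions the mask keeps."""
    return float(np.abs(dec[mask.astype(bool)]).sum())

def two_way_match(masks: np.ndarray, decs: np.ndarray) -> list[tuple[int, int]]:
    """Toy two-way matching: repeatedly pair the (mask, value-vector)
    couple with the highest mutual preference, so neither layer
    one-sidedly dictates the pairing."""
    scores = np.array([[mutual_preference(m, d) for d in decs] for m in masks])
    pairs: list[tuple[int, int]] = []
    free_m, free_d = set(range(len(masks))), set(range(len(decs)))
    while free_m and free_d:
        i, j = max(((i, j) for i in free_m for j in free_d),
                   key=lambda ij: scores[ij])
        pairs.append((i, j))
        free_m.remove(i)
        free_d.remove(j)
    return pairs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop, n_var = 4, 10
    masks = (rng.random((pop, n_var)) < 0.3).astype(float)   # sparse low-level layer
    decs = rng.uniform(-1.0, 1.0, size=(pop, n_var))          # real-valued high-level layer
    for i, j in two_way_match(masks, decs):
        print(f"mask {i} <-> dec {j}:", decode(masks[i], decs[j]))
```

The design choice the sketch tries to convey is that the pairing is driven by a score computed from both layers, rather than letting the mask alone determine which value vector it is combined with; the paper's actual preference definition and matching strategy should be consulted for the method itself.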

Keywords