AIP Advances (Dec 2021)

Shared seagull optimization algorithm with mutation operators for global optimization

  • Bing Ma,
  • Peng-min Lu,
  • Yong-gang Liu,
  • Qiang Zhou,
  • Yong-tao Hu

DOI
https://doi.org/10.1063/5.0073335
Journal volume & issue
Vol. 11, no. 12
pp. 125217 – 125217-27

Abstract


The seagull optimization algorithm (SOA) suffers from low convergence accuracy, weak population diversity, and a tendency to fall into local optima, especially on high-dimensional and multimodal problems. To overcome these shortcomings, this study first proposes a shared SOA (SSOA) that combines a sharing multi-leader strategy with a self-adaptive mutation operator. In addition, seven new variants of the SSOA algorithm are proposed, employing the Gaussian mutation operator, Cauchy mutation operator, Lévy flights mutation operator, improved Tent chaos mutation operator, neighborhood centroid opposition-based learning mutation operator, elite opposition-based learning mutation operator, and a simulated annealing algorithm combined with other mutation operators, namely, GSSOA, CSSOA, LFSSOA, ITSSOA, ESSOA, NSSOA, and CMSSOA, respectively. The performance of these variants was then evaluated on 23 benchmark functions, and the best variant was further evaluated against other optimizers on a comprehensive set of 43 benchmark problems and three real-world problems. Experimental and statistical results demonstrate that the proposed CMSSOA algorithm outperforms the other SSOA variants and the competitor approaches.
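The mutation operators named above are standard building blocks in the evolutionary-computation literature. The following is a minimal illustrative sketch of three of them (Gaussian, Cauchy, and Lévy-flight perturbations of a candidate solution); it is not the authors' implementation, and the parameter names `sigma`, `gamma`, and `beta` are assumptions chosen for clarity:

```python
import math
import random

def gaussian_mutation(position, sigma=0.1):
    # Perturb each dimension with zero-mean Gaussian noise (small local steps).
    return [x + random.gauss(0.0, sigma) for x in position]

def cauchy_mutation(position, gamma=0.1):
    # Cauchy noise is heavy-tailed, so occasional large jumps occur,
    # which helps the search escape local optima.
    return [x + gamma * math.tan(math.pi * (random.random() - 0.5))
            for x in position]

def levy_step(beta=1.5):
    # One Lévy-flight step length via Mantegna's algorithm:
    # mixes many short moves with rare long excursions.
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

if __name__ == "__main__":
    random.seed(0)
    x = [0.5, -1.2, 3.0]  # a hypothetical seagull position vector
    print(gaussian_mutation(x))
    print(cauchy_mutation(x))
    print(levy_step())
```

In a self-adaptive scheme such as the one the abstract describes, the step-size parameters would typically shrink over the course of the run, shifting the search from exploration toward exploitation.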