IEEE Access (Jan 2023)

Threshold Binary Grey Wolf Optimizer Based on Multi-Elite Interaction for Feature Selection

  • Hongzhuo Wu,
  • Shiyu Du,
  • Yiming Zhang,
  • Quan Zhang,
  • Kai Duan,
  • Yanru Lin

DOI
https://doi.org/10.1109/ACCESS.2023.3263584
Journal volume & issue
Vol. 11
pp. 34332 – 34348

Abstract


The traditional grey wolf optimizer is widely used for feature selection. However, on complex, high-dimensional feature problems it is prone to premature convergence and locally optimal solutions. In this paper, a threshold binary grey wolf optimizer based on multi-elite interaction for feature selection (MTBGWO) is proposed. Firstly, a multi-population topology is adopted to enhance population diversity and improve utilization of the search space. Secondly, an information-interaction learning strategy updates each sub-population's elite wolf position (optimal position) by learning better positions from the other elite wolves, in order to improve the local exploitation ability of the sub-population; at the same time, the command of the β and δ wolves (the second- and third-best positions) over population position updates is removed. Finally, a threshold approach converts the continuous positions of grey wolf individuals into binary ones so they can be applied to the feature selection problem. Further, the proposed MTBGWO algorithm is compared with the traditional binary grey wolf algorithm (BGWO), the binary whale algorithm (BWOA), and several recently developed algorithms to demonstrate its superiority and robustness. A total of 16 classification datasets from the UCI Machine Learning Repository are chosen for comparison. Wilcoxon's rank-sum non-parametric statistical test is carried out at the 5% significance level to evaluate whether the results of the proposed algorithm differ significantly from those of the other algorithms. Across all datasets, the overall average accuracy of MTBGWO is 94.7%, while the highest among the other algorithms is 92.8%, and the selected feature subset is 25% of the full feature set; MTBGWO selects a much smaller subset of features than the other algorithms.
In terms of computational efficiency, the overall processing time of MTBGWO is 24.2 seconds, whereas that of HSGW is 44.1 seconds. The results show the superiority of MTBGWO in solving the feature selection problem.
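The thresholding step described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it assumes a fixed elementwise threshold of 0.5 on positions normalized to [0, 1], where a 1 means the corresponding feature is kept in the subset and a 0 means it is dropped.

```python
import numpy as np

def binarize_positions(positions, threshold=0.5):
    """Map continuous grey-wolf positions in [0, 1] to binary
    feature-selection vectors: 1 keeps a feature, 0 drops it.
    Illustrative sketch; the paper's exact threshold rule may differ."""
    return (np.asarray(positions) > threshold).astype(int)

# Example: a pack of 3 wolves searching over 5 candidate features
pack = np.array([[0.9, 0.2, 0.6, 0.4, 0.7],
                 [0.1, 0.8, 0.3, 0.5, 0.95],
                 [0.55, 0.45, 0.0, 1.0, 0.2]])
print(binarize_positions(pack))
```

Each row of the output is one wolf's candidate feature subset, which is then scored by classification accuracy and subset size during the fitness evaluation.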

Keywords