Informatyka, Automatyka, Pomiary w Gospodarce i Ochronie Środowiska (Jun 2024)

IMPROVING PARAMETERS OF V-SUPPORT VECTOR REGRESSION WITH FEATURE SELECTION IN PARALLEL BY USING QUASI-OPPOSITIONAL AND HARRIS HAWKS OPTIMIZATION ALGORITHM

  • Omar Mohammed Ismael,
  • Omar Saber Qasim,
  • Zakariya Yahya Algamal

DOI
https://doi.org/10.35784/iapgos.5729
Journal volume & issue
Vol. 14, no. 2

Abstract


Numerous real-world problems have been addressed using support vector regression, particularly v-support vector regression (v-SVR), but some of its parameters must be tuned manually. Furthermore, v-SVR does not perform feature selection. Nature-inspired techniques have been used for feature selection and hyper-parameter estimation. In this research, the quasi-oppositional Harris hawks optimization algorithm (QOBL-HHOA) is introduced to embed feature selection and optimize the hyper-parameters of v-SVR at the same time. Experiments were performed on four datasets. The results demonstrate that, in terms of prediction accuracy, the number of selected features, and execution time, the proposed algorithm outperforms cross-validation and grid-search methods. Compared with other nature-inspired algorithms, the experimental results of QOBL-HHOA show its efficacy in improving prediction accuracy and processing time. They also demonstrate the ability of QOBL-HHOA to locate the features most useful for prediction while searching for the optimal hyper-parameter values. As a result, the QOBL-HHOA algorithm may be more appropriate than other algorithms for identifying the relationship between the input features and the target variable. The numerical results confirm this superiority; for example, the mean squared error of the QOBL-HHOA method (2.05E-07) on the influenza neuraminidase dataset was lower than that of the other methods. This is highly useful for making predictions in other real-world situations.
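
The following is a minimal illustrative sketch (not the authors' implementation) of the joint encoding the abstract describes: one candidate solution holds both a binary feature mask and the v-SVR hyper-parameters (nu, C, gamma), and its fitness is the cross-validated mean squared error. The synthetic dataset, parameter bounds, and simple random search used here are hypothetical stand-ins for the QOBL-HHOA search loop.

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import NuSVR

    rng = np.random.default_rng(0)
    # Hypothetical regression data standing in for one of the paper's datasets.
    X, y = make_regression(n_samples=200, n_features=20, noise=0.1, random_state=0)

    def fitness(candidate):
        """Decode a candidate: first n_features genes -> feature mask,
        last three genes -> (nu, C, gamma) for NuSVR."""
        mask = candidate[:X.shape[1]] > 0.5
        if not mask.any():                       # at least one feature must remain
            return np.inf
        nu, C, gamma = candidate[-3:]
        model = NuSVR(nu=nu, C=C, gamma=gamma)
        # Fitness = 5-fold cross-validated mean squared error on selected features.
        return -cross_val_score(model, X[:, mask], y,
                                scoring="neg_mean_squared_error", cv=5).mean()

    # Stand-in for the metaheuristic: sample random candidates and keep the best.
    best, best_mse = None, np.inf
    for _ in range(50):
        cand = np.concatenate([rng.random(X.shape[1]),        # feature-mask genes
                               [rng.uniform(0.05, 0.95),      # nu in (0, 1)
                                rng.uniform(0.1, 100.0),      # C
                                rng.uniform(1e-3, 1.0)]])     # gamma
        mse = fitness(cand)
        if mse < best_mse:
            best, best_mse = cand, mse
    print("best CV MSE:", best_mse)

In the paper's approach, the random sampling above would be replaced by the Harris hawks update rules combined with quasi-oppositional learning, so that feature selection and hyper-parameter tuning are carried out by the same population of candidate solutions.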

Keywords