IEEE Access (Jan 2022)

An Optimized Continuous Dragonfly Algorithm Using Hill Climbing Local Search to Tackle the Low Exploitation Problem

  • Bibi Aamirah Shafaa Emambocus,
  • Muhammed Basheer Jasser,
  • Angela Amphawan

DOI
https://doi.org/10.1109/ACCESS.2022.3204752
Journal volume & issue
Vol. 10
pp. 95030 – 95045

Abstract

Optimization problems are usually solved using heuristic algorithms such as swarm intelligence algorithms owing to their ability to provide near-optimal solutions in a feasible amount of time. An example of an optimization problem is the training of an artificial neural network to obtain the optimal connection weights. The Artificial Neural Network (ANN), one of the most prominent machine learning algorithms, has a multitude of applications in a myriad of areas. Recently, the use of ANNs has risen sharply owing to their ability to draw conclusions from given inputs. This ability is primarily acquired during the training phase of the ANN, a vital process that must be completed before the ANN can be used. Gradient descent-based algorithms, which are usually used for training, often become trapped in local optima and are thus unable to obtain the optimal connection weights of the ANN. Metaheuristic algorithms, including swarm intelligence algorithms, have been found to be a better alternative for training ANNs. The Dragonfly Algorithm (DA) is a swarm intelligence algorithm that has been found to be more effective than several other swarm intelligence algorithms. However, despite its good performance, it still suffers from low exploitation. In this paper, we propose to further improve the performance of DA by using hill climbing as a local search technique to address its low exploitation. The optimized DA is then used to train artificial neural networks employed for classification problems. Based on the experimental results, the optimized DA is more effective than the original DA and some other swarm intelligence algorithms, as the ANNs trained by the optimized DA have a lower root mean squared error and a higher classification accuracy.
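
The sketch below is not the authors' code; it is a minimal illustration, under assumed names and data, of the exploitation idea the abstract describes: after a Dragonfly Algorithm iteration, the best candidate weight vector is refined with a greedy hill climbing local search, using the RMSE of a small feed-forward network as the fitness. The functions `hill_climb` and `rmse_fitness`, the step size, and the toy data set are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (illustrative only).
X = rng.normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def rmse_fitness(weights, n_hidden=5):
    """RMSE of a 4-n_hidden-1 network whose parameters are flattened into `weights`."""
    w1 = weights[: 4 * n_hidden].reshape(4, n_hidden)
    b1 = weights[4 * n_hidden : 5 * n_hidden]
    w2 = weights[5 * n_hidden : 6 * n_hidden]
    b2 = weights[6 * n_hidden]
    hidden = np.tanh(X @ w1 + b1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))  # sigmoid output
    return np.sqrt(np.mean((out - y) ** 2))

def hill_climb(start, fitness, step=0.1, iters=50):
    """Greedy local search: accept a random perturbation only if it lowers the fitness."""
    best = start.copy()
    best_fit = fitness(best)
    for _ in range(iters):
        candidate = best + rng.normal(scale=step, size=best.shape)
        cand_fit = fitness(candidate)
        if cand_fit < best_fit:  # pure exploitation around the current best
            best, best_fit = candidate, cand_fit
    return best, best_fit

# Stand-in for the best dragonfly position after a DA iteration.
dim = 4 * 5 + 5 + 5 + 1
best_position = rng.normal(size=dim)
refined, refined_rmse = hill_climb(best_position, rmse_fitness)
print(f"RMSE before: {rmse_fitness(best_position):.4f}, after hill climbing: {refined_rmse:.4f}")
```

In the hybrid scheme described by the abstract, a refinement step of this kind would be interleaved with the usual DA position updates, so the swarm retains its exploration while the local search sharpens the best solution found so far.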

Keywords