Mathematics (Oct 2021)

Performance of a Novel Chaotic Firefly Algorithm with Enhanced Exploration for Tackling Global Optimization Problems: Application for Dropout Regularization

  • Nebojsa Bacanin,
  • Ruxandra Stoean,
  • Miodrag Zivkovic,
  • Aleksandar Petrovic,
  • Tarik A. Rashid,
  • Timea Bezdan

DOI
https://doi.org/10.3390/math9212705
Journal volume & issue
Vol. 9, no. 21
p. 2705

Abstract

Swarm intelligence techniques have been developed to address theoretical and practical global optimization problems. This paper puts forward an enhanced version of the firefly algorithm that corrects the acknowledged drawbacks of the original method by introducing an explicit exploration mechanism and a chaotic local search strategy. The resulting augmented approach was theoretically tested on two sets of bound-constrained benchmark functions from the CEC suites and practically validated for automatically selecting the optimal dropout rate for the regularization of deep neural networks. Despite their successful application across a wide spectrum of fields, deep learning algorithms face one important problem: overfitting. The traditional way of preventing overfitting is regularization, and the first option in this sense is the choice of an adequate value for the dropout parameter. To demonstrate its ability to find an optimal dropout rate, the enhanced version of the firefly algorithm was validated in the deep learning subfield of convolutional neural networks on five standard benchmark datasets for image processing: MNIST, Fashion-MNIST, Semeion, USPS and CIFAR-10. The performance of the proposed approach in both types of experiments was compared with that of other recent state-of-the-art methods, and statistical tests were conducted to confirm that the improvements in results are significant. Based on the experimental data, it can be concluded that the proposed algorithm clearly outperforms the other approaches.
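To make the optimization loop behind the abstract concrete, the sketch below shows the standard firefly update rule (attractiveness decaying with distance, plus a small random step) applied to a one-dimensional dropout-rate search. This is a minimal, hypothetical illustration, not the paper's enhanced variant: the explicit exploration mechanism and chaotic local search are not reproduced, and the objective `validation_loss` is a placeholder standing in for training and evaluating a CNN with the candidate dropout rate.

```python
# Minimal sketch of the classic firefly algorithm searching for a dropout rate
# in [0, 0.9]. Assumption: `validation_loss` is a surrogate for the real
# objective (training a CNN with the candidate rate and measuring its error).
import math
import random

def validation_loss(dropout_rate: float) -> float:
    # Placeholder objective; a smooth surrogate stands in for CNN validation error.
    return (dropout_rate - 0.45) ** 2

def firefly_dropout_search(pop_size=10, iterations=30,
                           beta0=1.0, gamma=1.0, alpha=0.2,
                           low=0.0, high=0.9, seed=0):
    rng = random.Random(seed)
    fireflies = [rng.uniform(low, high) for _ in range(pop_size)]
    intensity = [validation_loss(x) for x in fireflies]  # lower loss = brighter

    for _ in range(iterations):
        for i in range(pop_size):
            for j in range(pop_size):
                if intensity[j] < intensity[i]:  # j is brighter, so i moves toward j
                    r = abs(fireflies[i] - fireflies[j])
                    beta = beta0 * math.exp(-gamma * r ** 2)   # attractiveness
                    step = alpha * (rng.random() - 0.5)        # random perturbation
                    fireflies[i] += beta * (fireflies[j] - fireflies[i]) + step
                    fireflies[i] = min(max(fireflies[i], low), high)
                    intensity[i] = validation_loss(fireflies[i])
        alpha *= 0.97  # gradually reduce randomness over iterations

    best = min(range(pop_size), key=lambda k: intensity[k])
    return fireflies[best], intensity[best]

if __name__ == "__main__":
    rate, loss = firefly_dropout_search()
    print(f"best dropout rate ~ {rate:.3f} (surrogate loss {loss:.4f})")
```

In the paper's setting, each intensity evaluation corresponds to a full train-and-validate cycle of the network, which is why an efficient search with good exploration matters.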

Keywords