IEEE Access (Jan 2018)

Long Short Term Memory Hyperparameter Optimization for a Neural Network Based Emotion Recognition Framework

  • Bahareh Nakisa,
  • Mohammad Naim Rastgoo,
  • Andry Rakotonirainy,
  • Frederic Maire,
  • Vinod Chandran

DOI
https://doi.org/10.1109/ACCESS.2018.2868361
Journal volume & issue
Vol. 6
pp. 49325 – 49338

Abstract

Recently, emotion recognition using low-cost wearable sensors based on electroencephalogram and blood volume pulse has received much attention. Long short-term memory (LSTM) networks, a special type of recurrent neural network, have been applied successfully to emotion classification. However, the performance of these sequence classifiers depends heavily on their hyperparameter values, and an efficient method is needed to find optimal values. To address this problem, we propose a new framework to automatically optimize LSTM hyperparameters using differential evolution (DE). This is the first systematic study of hyperparameter optimization in the context of emotion classification. In this paper, we evaluate and compare the proposed framework with other state-of-the-art hyperparameter optimization methods (particle swarm optimization, simulated annealing, random search, and Tree of Parzen Estimators) using a new dataset collected from wearable sensors. Experimental results demonstrate that optimizing LSTM hyperparameters significantly improves the recognition rate of four-quadrant dimensional emotions, with a 14% increase in accuracy. The best model, based on the LSTM classifier optimized with the DE algorithm, achieved 77.68% accuracy. The results also showed that evolutionary computation algorithms, particularly DE, are competitive for finding optimized LSTM hyperparameter values. Although the DE algorithm is computationally expensive, it is less complex and offers higher diversity in finding optimal solutions.
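
The sketch below illustrates the kind of search the abstract describes: a standard DE/rand/1/bin loop over LSTM hyperparameters. It is a minimal illustration, not the authors' implementation; the hyperparameters (hidden units, batch size), their bounds, and the fitness function are assumptions, and the placeholder fitness would be replaced by training an LSTM on the sensor features and returning validation accuracy.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative search space: [hidden_units, batch_size]
lower = np.array([32.0, 16.0])
upper = np.array([256.0, 128.0])

def fitness(params):
    hidden_units, batch_size = int(params[0]), int(params[1])
    # Placeholder objective. In the described framework this would train an
    # LSTM with these hyperparameters and return validation accuracy.
    return -((hidden_units - 128) ** 2 + (batch_size - 64) ** 2)

POP, F, CR, GENS = 10, 0.8, 0.9, 20
pop = rng.uniform(lower, upper, size=(POP, 2))
scores = np.array([fitness(p) for p in pop])

for _ in range(GENS):
    for i in range(POP):
        # Mutation: combine three distinct individuals (DE/rand/1)
        idx = rng.choice([j for j in range(POP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = np.clip(a + F * (b - c), lower, upper)
        # Binomial crossover with at least one gene taken from the mutant
        cross = rng.random(2) < CR
        cross[rng.integers(2)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection: keep the trial only if it scores better
        trial_score = fitness(trial)
        if trial_score > scores[i]:
            pop[i], scores[i] = trial, trial_score

best = pop[np.argmax(scores)]
print({"hidden_units": int(best[0]), "batch_size": int(best[1])})

In this setup, each DE generation evaluates POP candidate hyperparameter sets, which is why the abstract notes that DE is computationally expensive while remaining structurally simple.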

Keywords