IEEE Access (Jan 2021)

Incorporating Structural Plasticity Approaches in Spiking Neural Networks for EEG Modelling

  • Mahima Milinda Alwis Weerasinghe,
  • Josafath I. Espinosa-Ramos,
  • Grace Y. Wang,
  • Dave Parry

DOI: https://doi.org/10.1109/ACCESS.2021.3099492
Journal volume & issue: Vol. 9, pp. 117338–117348

Abstract


Structural Plasticity (SP) in the brain is a process that allows structural neuronal changes in response to learning. Spiking Neural Networks (SNNs) are an emerging form of artificial neural networks that use brain-inspired techniques to learn. However, the application of SP in SNNs and its impact on overall learning and network behaviour are rarely explored. In the present study, we use an SNN with a single hidden layer to apply SP in classifying Electroencephalography (EEG) signals from two publicly available datasets. We treated classification accuracy as the measure of learning capability and applied metaheuristics to derive the optimised number of hidden-layer neurons along with the network's other hyperparameters. The optimised structure was then compared with overgrown and undergrown structures in terms of accuracy, stability, and network behaviour. Networks with SP yielded ~94% and ~92% accuracies in classifying wrist positions and mental states (stressed vs. relaxed), respectively. The same SNN developed for mental stress classification produced ~77% and ~73% accuracies in classifying arousal and valence states. Moreover, the networks with SP demonstrated superior performance stability across repeated random initialisations. Interestingly, these networks had fewer inactive neurons and a preference for lower neuron firing thresholds. This research highlights the importance of systematically selecting the number of hidden-layer neurons rather than setting it arbitrarily, particularly for SNNs using Spike-Timing-Dependent Plasticity (STDP) learning, and offers findings that may inform the development of SP learning algorithms for SNNs.
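The abstract describes a single-hidden-layer SNN trained with STDP, in which the number of hidden neurons and the firing threshold are treated as tunable hyperparameters. The sketch below is a minimal, hypothetical illustration of that setup, assuming leaky integrate-and-fire hidden neurons and pair-based STDP; all function names, constants, and dynamics are illustrative assumptions, not the authors' implementation or the paper's actual model.

```python
# Hypothetical sketch: a single-hidden-layer spiking network with pair-based STDP.
# The hidden-layer size (n_hidden) and firing threshold are the kind of
# hyperparameters the study optimises with metaheuristics. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def run_snn(spike_trains, n_hidden=50, threshold=0.5, tau_mem=10.0,
            a_plus=0.01, a_minus=0.012, tau_stdp=20.0):
    """Drive a LIF hidden layer with binary input spike trains and adapt the
    input->hidden weights online with pair-based STDP.

    spike_trains: (n_steps, n_inputs) array of 0/1 input spikes.
    Returns hidden-neuron spike counts and the adapted weight matrix.
    """
    n_steps, n_inputs = spike_trains.shape
    w = rng.uniform(0.0, 0.2, size=(n_inputs, n_hidden))   # input->hidden weights
    v = np.zeros(n_hidden)                                  # membrane potentials
    pre_trace = np.zeros(n_inputs)                          # presynaptic STDP trace
    post_trace = np.zeros(n_hidden)                         # postsynaptic STDP trace
    hidden_counts = np.zeros(n_hidden)

    for t in range(n_steps):
        pre = spike_trains[t]
        # Leaky integration of weighted input spikes.
        v = v * np.exp(-1.0 / tau_mem) + pre @ w
        post = (v >= threshold).astype(float)
        v[post > 0] = 0.0                                   # reset neurons that fired
        hidden_counts += post

        # Exponentially decaying eligibility traces.
        pre_trace = pre_trace * np.exp(-1.0 / tau_stdp) + pre
        post_trace = post_trace * np.exp(-1.0 / tau_stdp) + post

        # Pair-based STDP: potentiate on post spikes, depress on pre spikes.
        w += a_plus * np.outer(pre_trace, post)
        w -= a_minus * np.outer(pre, post_trace)
        np.clip(w, 0.0, 1.0, out=w)

    return hidden_counts, w

# Toy usage: random input spikes standing in for spike-encoded EEG channels.
spikes = (rng.random((200, 14)) < 0.05).astype(float)       # 200 steps, 14 channels
counts, weights = run_snn(spikes, n_hidden=30, threshold=0.4)
print(counts)
```

In this framing, "overgrown" and "undergrown" structures correspond to choosing n_hidden well above or below the metaheuristically optimised value, and inactive neurons are those whose spike counts remain zero after training.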

Keywords