Measurement: Sensors (Dec 2022)
Human activity recognition based on hybrid learning algorithm for wearable sensor data
Abstract
Human Activity Recognition (HAR) based on sensor devices and the Internet of Things (IoT) has attracted many researchers because of its diverse applications in healthcare, smart environments, and entertainment. HAR has emerged as an important health-monitoring application, requiring the continuous use of smartphones, smartwatches, and wearable devices to capture patients' daily activities. To predict multiple human activities, deep learning (DL)-based methods have been successfully applied to the time-series data generated by smartphones and wearable sensors. Although DL-based approaches have been deployed for activity recognition, they still encounter several issues when working with time-series data; the proposed methodology addresses these issues. This work proposes two Hybrid Learning Algorithms (HLA) that build comprehensive classification methods for HAR using wearable sensor data: the Convolution Memory Fusion Algorithm (CMFA) and the Convolution Gated Fusion Algorithm (CGFA), which learn both local features and long-term (memory- and gate-based) dependencies in sequential data. Feature extraction is enhanced by convolutional filters of various sizes, which capture local temporal dependencies at different scales. Evaluated on the WISDM dataset, the proposed models achieved accuracies of 97.76% (smartwatch) and 94.98% (smartphone) for CMFA, and 96.91% (smartwatch) and 84.35% (smartphone) for CGFA. Experimental results show that these models achieve greater accuracy than other existing deep neural network frameworks.
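The architecture described above (parallel convolutions with several filter sizes feeding a recurrent layer) can be sketched as follows. This is a minimal illustrative sketch in Keras, not the paper's exact model: the function name, layer widths, kernel sizes, and sequence shape are all assumptions chosen for demonstration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model


def build_cmfa_sketch(timesteps=200, channels=6, n_classes=18,
                      kernel_sizes=(3, 5, 7), filters=64, lstm_units=128):
    """Hypothetical sketch of a CMFA-style CNN+LSTM hybrid classifier.

    Parallel Conv1D branches with different kernel sizes capture local
    temporal patterns at several scales; their concatenation feeds an
    LSTM that models long-term dependencies. All hyperparameters here
    are illustrative, not the paper's reported configuration.
    """
    inputs = layers.Input(shape=(timesteps, channels))
    # One convolutional branch per kernel size (multi-scale local features).
    branches = [
        layers.Conv1D(filters, k, padding="same", activation="relu")(inputs)
        for k in kernel_sizes
    ]
    x = layers.Concatenate()(branches)
    x = layers.MaxPooling1D(pool_size=2)(x)
    # Recurrent layer models long-term dependencies across the sequence;
    # swapping LSTM for GRU here would give a CGFA-style variant.
    x = layers.LSTM(lstm_units)(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return Model(inputs, outputs)
```

A CGFA-style variant would replace the `LSTM` layer with a `GRU` layer, trading the memory cell for gated units with fewer parameters.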